Monday, July 31, 2006

Presbyterian publisher tackles 9/11 conspiracy

How seriously should we take the idea that the 9/11 attacks look like an inside job?

Very seriously, says a publishing arm of the Presbyterian Church USA. "Christian Faith and the Truth Behind 9/11" by David Ray Griffin, the noted 9/11 critic and theologian, is being published by Westminster John Knox Press, a division of Presbyterian Publishing.

Jack Keller, a vice president of the publishing house, told Christianity Today that Griffin raised some "interesting questions" that warranted attention. The publisher has printed a number of Griffin's theological works over the years, but Griffin's previous 9/11 books were published elsewhere.

One scholar questioned by Christianity Today attributed the church publisher's decision to publish Griffin to an anti-Israel bias.
__________________________
Check out the bioterror series by Joby Warrick at the Washington Post. The concerns raised in my article are covered here, but with a bit more polish and some detail that I did not have.

Friday, July 28, 2006

Media wars

Essentially, American and British media are now instruments of an international oligarchy. Most media reach is in the hands of a few. But those few still behave much like the robber barons of yore, continually pushing to expand their reach and protect their positions.

They are, though, having a tough time with the internet, which is cutting into their ad revenues. Still, it is probable that big investors in internet powerhouses such as Google, Yahoo, AOL and so on are elbowing their way to the table of the oligarchs.

Clearly, one of the things the oligarchs do is agree as to what news is not to be disseminated. They may do this with encrypted phone calls or perhaps with a wink and a nod. Those who are not under the control of these oligarchs find that they are pressured to play along and ignore news that has been put on the oligarchy's spike list, such as objective reporting of 9/11 coverups.

So what does the oligarchy do when confronted with the phenomenon of Znewz1? Sure there are plenty of other blogs and websites that question the system. But Znewz1 is run by a former newspaperman who knows the ways of the news business. This reporter has contributed to the New York Times and worked for other major newspapers.

What really bothers the oligarchy is competition!

These guys stay awake nights trying to figure out how to squelch competition.

Now Znewz1 doesn't provide any direct economic competition. But competition for credibility does affect economic standing (as I can well attest).

I conjecture that big media jealousy and anti-competitive policies have a lot to do with the continual interference with Znewz1 and the ongoing campaign to keep Paul Conant's work from gaining visibility on the net. You may remember how media biggies have cozied up to the White House in a cloying bid to get legislation that would leave them even less competition.

This said, we must also praise those reporters and editors who do what they can to overcome the oligarchic yoke.

Can you read the sidebar?

I don't know whether you can see the sidebar content, but I can't.

The sidebar went blank after a reader said he checked a twin towers link on the sidebar and the link returned a "URL not found." He thought this bum URL was of interest in light of my links being systematically banished from Wiki.

Obviously, when I put them up, I checked the sidebar links to make sure they worked. Also, I checked the blog code to make sure the sidebar instructions hadn't been erased. They hadn't.

Well, at least I know we're not exactly getting the ho-hum treatment. But this sort of harassment has been going on for a long time. I don't accept it but I'm awfully accustomed to it.

Please write me at znewz_1@yahoo.com and let me know whether you can read the sidebar or let me know of other problems you encounter. Thanks.

Wednesday, July 26, 2006

9/11 probe ignored 'free fall' issue

All three World Trade Center buildings fell at near the rate of free fall, the maximum rate possible in the absence of resistance. In other words, both structural resistance and air resistance were negligible.

It is well known that buildings felled by controlled demolition tend to fall near the free-fall rate because lower supports are blown out, so there is little structural resistance. The rates of fall are what set off alarm bells in some quarters.

Considering the enormous potential energy in the top sections of the twin towers, a fall rate near free fall might be plausible. In fact, I once did a back-of-the-envelope differential equation, estimating the structural resistance from the steel-to-air ratio, and found that the results for both buildings were near the free-fall rate. But my calculation was no substitute for a computer model.
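For readers who want to see the flavor of such a check, here is a minimal sketch (not my original calculation) that integrates the fall of the upper block with a resistive force set at an assumed fraction of the block's weight and compares the result with pure free fall. The tower height and the resistance fractions are illustrative assumptions only.

import math

G = 9.81          # gravitational acceleration, m/s^2
HEIGHT = 417.0    # rough twin-tower height in meters (assumption)

def fall_time(resistance_fraction, dt=0.001):
    # Time to descend HEIGHT when resistance equals the given fraction of the weight.
    y = v = t = 0.0
    while y < HEIGHT:
        v += G * (1.0 - resistance_fraction) * dt
        y += v * dt
        t += dt
    return t

print(f"pure free fall:              {math.sqrt(2 * HEIGHT / G):.2f} s")
print(f"10% of weight as resistance: {fall_time(0.10):.2f} s")
print(f"30% of weight as resistance: {fall_time(0.30):.2f} s")

The point of the toy is only that a modest resistance changes the fall time by a modest amount; it says nothing about what the actual resistance was.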

BUT the National Institute of Standards and Technology simply ignored the issue. The NIST computer models only reached the point where "global collapse" was triggered. The scientists ran no computer model of the actual fall, meaning that the collapse rate was never addressed. Yet, the government knew that the rate-of-fall question was high in the minds of a number of experts.

And, they published no other data or expert opinion concerning the rates of fall.

In addition, the NIST's report on the twin towers went to press without a report on the day's seismic activity, which was squelched without explanation. (See my trade center links.)

NIST officials have tried to cope with WTC7's free-fall puzzle by supposing that the building collapse began at a low floor, but that probe is still under way. The contractor designated to generate a computer model of the WTC7 collapse, based solely on NIST input values, does not have authority to do a simulation that includes rate of fall.

Now it is possible that the simulation of the lead-up to collapse could not include the events of the collapse itself. But there is no word as to why separate simulations covering the actual collapses weren't feasible. Nor is there any opinion presented as to the probabilities of such rates of fall.

Tuesday, July 25, 2006

Two more Wiki nixes

I've found that at least two more of my links have been bounced from Wiki. It appears that someone is chasing my links around, revamping the Wiki page and banishing my stuff.

One banished link is my take on the twin paradox found at http://www.angelfire.com/az3/nfold/twins.html

Another concerns my short version of the set theoretic Schroeder-Bernstein theorem.

Sometime after I added my link, I noticed a front-page article in the New York Times reporting that some pages, including the related Einstein page, had become untouchable because of disputes over editorial control. However, it appears to me that a link is not a big deal, unless your aim is personal. People know that internet writings have to be viewed carefully. It's really not up to some invisible force (perhaps a Wiki administrator or perhaps someone with a flair for computer manipulation) to determine that some link or other should be removed and blocked from being restored.

A quick look shows that the page was extensively re-edited to throw out the thoughts of various people.

The Schroeder-Bernstein page has been sanitized of all links, including mine.

The "line" is taking over Wiki, it appears. It won't be long before all sorts of material on the government's 9/11 fraud will be sanitized, distorted and otherwise made of little use by someone acting behind the scenes to "get control."

BTW, no response has been received to an email sent to Wiki founder Cunningham asking what was going on.

Another Wiki nix

Another of my links has been bounced from Wiki. It appears that someone is chasing my links around, revamping the Wiki page and banishing my stuff.

The banished link is my take on the twin paradox found at http://www.angelfire.com/az3/nfold/twins.html

Sometime after I added my link, I noticed a front-page article in the New York Times reporting that some pages, including the related Einstein page, had become untouchable because of disputes over editorial control. However, it appears to me that a link is not a big deal, unless your aim is personal. People know that internet writings have to be viewed carefully. It's really not up to some invisible force (perhaps a Wiki administrator or perhaps someone with a flair for computer manipulation) to determine that some link or other should be removed and blocked from being restored.

A quick look shows that the page was extensively re-edited to throw out the thoughts of various people. The "line" is taking over Wiki, it appears.

Monday, July 24, 2006

Fight for the truth about 9/11!

Sixty-one Wisconsin state lawmakers are demanding the head of a professor who has challenged the official claims concerning 9/11.

At the same time, 911truth.org is conducting a poll of political candidates to see who favors a new probe of 9/11 on grounds that previous probes are discredited.

Well, obviously these Wisconsin lawmakers wouldn't dare play this game without the unholy silence of the Wisconsin media. I mean, thank God they printed the professor's remarks. But where is their moxie in going after the truth? Ha.

Anyway, here's an idea: fight back with an ad campaign that targets government lies about 9/11. True, some media will refuse to run the ads. But try, try. Somebody might run them. For example, full-page or even half-page ads in weekly newspapers are one possibility. Pick out a salient fact, such as the point that the government withheld almost all data concerning the numerous explosions in the twin towers.

Censored by Wiki

OK, maybe it's someone who thinks my math isn't right. But, I think it's more likely the motive is political.

My "Monty Hall over easy" page had been getting a fair number of hits, many directed through the Wikipedia article on the Monty Hall problem. However, someone has re-edited the page (which is how Wiki is supposed to work) and for reasons unknown, deleted my link. Whoever excised the Conant devil then found a way to put a spam-block on the link so that the bad Conant bogeyman couldn't be reinserted.

I wrote a note to Wiki founder Ward Cunningham asking what was up. Hopefully he'll reply.
My take is that political interference is the real cause. Those who reach my Monty page might go on to read other pages of mine that have raised hackles. Also, if you don't like what I have to say, you may be tempted to try to undercut whatever credibility I might have. That would include trying to cordon me off as a nobody without the competence to criticize the government, or others, on technical matters.

The Wiki nix comes in the context of many other such incidents.

Monty Hall over easy
http://www.angelfire.com/trek/nfold/monty.html

BTW, that page adds what I consider an interesting sidelight from mathematical information theory.

Friday, July 21, 2006

Ellsberg scorns official 9/11 probes

Daniel Ellsberg, the man who exposed government deception about the Vietnam War by leaking the "Pentagon Papers," says that "very serious questions" concerning possible government complicity in the 9/11 attacks require a new "hard-hitting investigation of a kind we've not seen."

Ellsberg, an intellectual once employed by the RAND Corp. think tank, said that the Bush administration was "capable, humanly and psychologically, of engineering such a provocation."

Ellsberg detonated a national firestorm by leaking the "Pentagon Papers" to the press. Those secret documents showed a pattern of official deception concerning the Vietnam war.

Ellsberg told an interviewer that though he found much of the inside job theorizing "very implausible," other criticisms are "quite solid, and there's no question in my mind that there's enough evidence there to justify a very comprehensive and hard-hitting investigation of a kind that we've not seen, with subpoenas, general questioning of people, and raising the release of a lot of documents."

Ellsberg continued that "there's no question" that "very serious questions have been raised about how much they [government officials] knew beforehand and how much involvement there may have been."

Ellsberg, who worked as an analyst in the Johnson administration, said he was familiar with the use of provocations, noting that the Gulf of Tonkin incident, which was used as a pretext for hostilities, potentially could have resulted in numerous American casualties.

Ellsberg warned that another 9/11-type attack could result in a "Reichstag fire" decree that ends liberty altogether.

These comments come from a transcript of an interview with Ellsberg in Infowars. The Infowars article is dated July 19, 2006.
The url is

http://www.infowars.com/articles/terrror/pentagon_papers_author_gov_maybe_did_911.htm

Thursday, July 20, 2006

"Ellsberg joins 9/11 doubters"

That was a headline I saw on an excerpt from a Free Market News Network story. Somehow the story vanished and further attempts to find it via various search engines failed.
I also tried searching the Infowars site, cited in the excerpt, with relevant keywords, but nada.

If Daniel Ellsberg, of Pentagon Papers fame, has indeed expressed skepticism about the government's theory of 9/11, that would be very important, considering that he has apparently been cautious on that point in the past. However, he has previously expressed doubt about the Bush group's use of innuendo to tie Saddam to 9/11 and has suggested that John Ashcroft, the former attorney general, possibly warranted a jail sentence for his behavior in squelching a probe of FBI translator Sibel Edmonds' concerns.

A clearer exposition of his views on the government's 9/11 claims is in order.

Tuesday, July 18, 2006

'Random route' encryption may beat NSA taps

Even before the internet, engineers found that random routing was a good method of making telephone connections.

Randomized routing decentralizes automated decision-making at relay points and tends to produce usable results. Otherwise, the best route, which varies from millisecond to millisecond, would require a "polynomial" solution to the traveling salesman problem or a futuristic quantum computer able to juggle enormous numbers.

But randomness is the tool of the cryptographer. And the National Security Agency is well aware of the problem of random routing, as James Bamford reported in his book about the NSA, Body of Secrets (Doubleday 2001). His unnamed spooks worry about digital voice signals being hard to track, with perhaps one talker's voice traveling by landline and the other's voice routed via satellite.

At any rate, it may be possible to refine the routing problem and effectively encrypt an internet message thus:

Perhaps using public key encryption to share private protocols, an algorithm arbitrarily chooses a binary digit string length [just to be fussy, we distinguish between BD length and bit, or quantity of information, found in a BD string]. Each string is given a tag (prefix or suffix) that gives the time it reached the encoder and the final address. Then each string is sent independently to that address. The decoder reads the times and compiles the strings in the correct order, reconstituting the message.
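To make the mechanics concrete, here is a minimal sketch under stated assumptions: a sequence number stands in for the arrival-time tag, the network is simulated by shuffling the fragments, and the public-key exchange and real routing are left out entirely.

import random

def fragment(message: bytes, dest: str, length: int = 4):
    # Cut the message into fixed-length pieces, each tagged with its position and destination.
    return [(i, dest, message[i:i + length])
            for i in range(0, len(message), length)]

def reassemble(pieces):
    # Sort the tagged pieces back into order and reconstitute the message.
    return b"".join(chunk for _, _, chunk in sorted(pieces, key=lambda p: p[0]))

msg = b"meet at the usual place at noon"
pieces = fragment(msg, dest="receiver.example")   # hypothetical address
random.shuffle(pieces)                            # fragments arrive in arbitrary order
assert reassemble(pieces) == msg

In a real system each piece would travel by its own route; the shuffle above merely stands in for that disorder.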

Note that an eavesdropper would need to be very close (circuit-wise) to the transmitter in order to intercept the entire message. Otherwise, an eavesdropper, even if he had the keys, would very likely pick up only disjointed fragments of the message, since the probability of two fragments taking the same route is exponentially low. Of course, the eavesdropper program, on spotting something suspicious, might go to the address and intercept all incoming traffic -- but, it might be too late to get much of it.

String length can be a constant or can vary, using a random number generator or some deterministic function.

And, it would normally be a good idea to use some other encryption system on the entire message before transmitting the fragments.

But, as long as the receiver has the ability to order the fragments, a transmitter sending short fragments effectively scrambles the message in transit, making tapping highly problematic. So, even if Congress tries to ensure that only tappable-message technology is used on the internet, here we have, potentially, an easy way around such restrictions.

Now such a system might be used for internet voice phone and live video. However, because of the problem of lag times for each snippet, the transmitter would need to send out quite a few copies of each snippet, with redundant arrivals discarded. This procedure would tend to compile the message quite rapidly, though there is a small increase in the possibility of meaningful interception. Still, there might be an annoying, but possibly tolerable overall message lag time, such as one experiences with satellite cell phones.

Additionally, compression techniques would be helpful if security were more of an issue than quality of reception. That is, one can eliminate certain harmonics from a telephone transmission and still have it "close enough" to the source voice (something I suspect cell phone firms already do). Similarly, information can be systematically discarded from images without unacceptable degradation (as in the lossy JPEG system). Hence, sending multiple copies of fragments may make sense.

Such a system would, if efficacious, sidestep the NSA's current warrantless wiretap program.

And even if Congress imposes restrictions on what type of encryption commercial ventures may make available, two people with a relatively simple software package can potentially employ strong anti-wiretap encryption.

However, I realize that some unforeseen problem might work against this idea.

Sunday, July 16, 2006

9/11 panel clairvoyance worthy of 'X Files'

OK, what's the non-conspiracy theory for this fact: The Kean panel issued its final report a year before the NIST finished its analysis of the collapse of the twin towers.
Obviously, the Kean panel was very confident the NIST would find no evidence of explosives long before the scientific analysis was done.
Additionally, the NIST probe of the collapse of WTC7, which probers have publicly called a "low probability" event, still hasn't been completed. So, if we're looking for a non-conspiracy theory, then the Kean panel has magical abilities to read the future.
Plus, assuming investigative objectivity, what if probers were to find that explosives were the probable cause of the WTC7 collapse? Wouldn't that cast doubt on the official theory about the twin towers? Hence, we can be sure that the probers already know what they are supposed to determine.
That is, the Kean panel's bizarre psychic abilities look like a case for the "X Files."
**********************
Since I began this blog, the hyperlink function on my Yahoo accounts has been disabled, at least for some, or perhaps most, receivers. The political motivation is clear. Every hindrance to my Znewz1 system cuts down on the number of people reading the blog. [The function had been messed with before that, but there is a new attack.]
Perhaps the behind-the-scenes justification given is the latest manufactured Mideast "crisis."

Saturday, July 15, 2006

Concealment of signals

Gee, I happened to look at Wikipedia's article on information theory and, under applications, found that concealment of a signal under the noise floor is used by GPS. Additionally, one uncovers the signal using a secret pseudorandom function.
So, though I didn't check my idea (below) with an expert, it appears that I was on the right track. Again, I note that the presence of a pseudorandom function may be detected with statistical analysis.
Also, the Wiki piece did not specify whether the GPS system's pseudorandom function aims to make noise power roughly equal to signal power, thus making the detectable signal about zero.

Thursday, July 13, 2006

U.S. Communists soft on 9/11 treason

We see a lot of implied haw-haw's showing up in the media these days about 9/11 "conspiracy theorists."

Reminds me of back when the New York Times was outraged that the government would LIE like crazy -- imagine that! -- about Vietnam, as demonstrated by the Pentagon papers; but even so, the Times was ABSOLUTELY CERTAIN that the government was telling the God's-honest-truth about the JFK murder.

Yet, the Times did an extensive investigation of the murder -- supposedly in order to rebut Warren commission critics -- but, upon examining their material, decided not to publish a word of it. And no one knows what happened to all those notes.

What I notice about those in the press using the "conspiracy theory" sobriquet is that rarely, extremely rarely, have they ever moved a muscle to do any legwork on 9/11. Those news organizations that are the testiest about those who question the official theory have invested little energy in investigating 9/11 -- and that's an especial disgrace for a slew of major New York-based outfits.

Of course those who are most likely to dump on 9/11 doubters are generally those who are well-known for their strong neocon leanings, such as the New York Post and the Wall Street Journal's editorial page. Yet, the neocon group in the media could not succeed so well without a lot of help. That would be the communist conspiracy.

We pooh-pooh such a notion these days. But China is still red, no matter what some claim, and the Chinese are having an increasing influence in America's corporate boardrooms.
And Putin is using KGB-strongarm tactics on Russia -- which, like Israel, had a lot to gain from the 9/11 attacks.

I just checked the Communist Party USA's web sites, and, as I suspected, the party's policy has been to toss a few limp-wristed pebbles at the official line on 9/11. The People's Daily World articles mostly mimicked what could be found in mainstream media, though with appropriate leftist remarks. What little the party's Political Affairs magazine had to say was hardly enough to rev up the red rank-and-file to make an issue of the government's 9/11 pack of lies. Though the magazine noted that the Kean Commission had "failed to connect the dots," the reds published nothing that might seriously undermine the federal story.

We should expect that the reds will have penetrated the 9/11 truth movement, hoping to neutralize it while playing ardent advocates.

Nothing to hide? 9/11 lid stays clamped

9/11 families are incensed that federal aviation records concerning that fateful day remain sealed, despite what they thought was a court victory.
Plainly, the feds just don't want that info out there for all the "paranoid conspiracy theorists" to begin connecting more dots that point to an "inside job."
Google "Staten Island Advance" for the story of the 9/11 lid.

Wednesday, July 12, 2006

Tampering with Yahoo email

Well speaking of a need for encryption, I find that hackers have once again tampered with my Yahoo email accounts.
This time all copies of a Znewz1 circular I sent out concerning the action of religious ministries in challenging NSA warrantless wiretaps have been deleted from two of my Yahoo accounts.
That circular, based on a handout, preceded the Forward's article by about a week.

Monday, July 10, 2006

Network analysis could work against CIA

A recent article on terrorism in Discover includes a computer screen graphic which shows a purported "network analysis" with bin Laden as the hub of a spider's web dense with spokes.
However, databases, some available commercially, might be used in connection with network analysis algorithms to spotlight numerous webs that would point to the likelihood of covert CIA operations, or to the secret activities of any intelligence agency.
You can bet your boots that every intelligence agency in the world is using such data mining techniques not only against potential terrorists, but against each other.
Now here's a thought: why shouldn't independent investigators use network analysis to throw a spotlight on the traitors who choreographed the 9/11 attacks and implemented coverups?
Really sharp scientists should be able to begin with enough input information to come up with some invaluable algorithmic outputs. Of course, showing up in such a network doesn't establish guilt. It's merely a method of giving probers people of interest who should be checked further.
For example, though Hatfill has not been convicted in the anthrax attacks, a network analysis run on him and others with the requisite backgrounds might give independent probers more places to poke around.
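For what it's worth, here is a toy sketch of the kind of link analysis I have in mind. The names are placeholders, not real people, and appearing as a hub in such a tally proves nothing by itself; it only flags nodes worth a closer look.

from collections import defaultdict

# Observed associations (all placeholder names).
associations = [
    ("person_a", "person_b"),
    ("person_a", "person_c"),
    ("person_b", "person_c"),
    ("person_a", "person_d"),
]

contacts = defaultdict(set)
for x, y in associations:
    contacts[x].add(y)
    contacts[y].add(x)

# Rank nodes by number of distinct contacts -- the crudest measure of a "hub."
for name, linked in sorted(contacts.items(), key=lambda kv: -len(kv[1])):
    print(name, len(linked), "distinct contacts")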

Thursday, July 06, 2006

Making waves, making static

Suppose you are concerned about the physical detection of your electromagnetic transmission, regardless of code-cracking concerns. You would want to make your signal as much like random noise as possible.
To do this, you would use the formula
detectable signal = (noise power - signal power)^(1/2)
and then make sure your noise power roughly equals your signal power so there is no detectable signal.
But you must use a deterministic algorithm that is good at approximating noise and that, added to typical noise over time, nearly equals signal power. The decoder computer then uses the algorithm's inverse to subtract out the pseudorandom noise.
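Here is a minimal sketch of that idea, assuming sender and receiver share a secret seed: the sender adds deterministic pseudorandom "noise" to the samples, and the receiver regenerates the same sequence and subtracts it. I make no claim about how well this would fool a real monitor.

import random

def mask(samples, seed, scale=1.0):
    # Add pseudorandom "noise" generated from the shared seed.
    rng = random.Random(seed)
    return [s + rng.uniform(-scale, scale) for s in samples]

def unmask(masked, seed, scale=1.0):
    # Regenerate the identical sequence and subtract it out.
    rng = random.Random(seed)
    return [m - rng.uniform(-scale, scale) for m in masked]

signal = [0.2, -0.5, 0.9, 0.1]
sent = mask(signal, seed=1234)                 # the seed is illustrative key material
recovered = unmask(sent, seed=1234)
assert all(abs(a - b) < 1e-9 for a, b in zip(signal, recovered))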
A disadvantage is that pseudorandom functions may be revealed by statistical analysis. Also, monitors may be suspicious of the amplitude of the noise and may detect it as coming in bursts, indicating message traffic.
Such a system might prove useful in masking keyboard transmissions. Keyboards, which typically (though not by design) give off recognizable electromagnetic waves for each key, are vulnerable to remote eavesdropping equipment.
I have not checked with any expert to see whether this idea is practical.

Ummm

Regarding the previous post: actually, I am not certain of current legal issues concerning encryption. Still, it makes sense in some cases for individuals and groups to come up with their own encryption algorithms rather than to obtain a software program that may have been compromised.

More on cryptograms

Of course, the Data Encryption Standard and its replacement, the Advanced Encryption Standard, use forms of block coding more complex than the example I gave. However, they do not employ dummy bit strings.
The AES is reportedly secure enough for even U.S. government classified matter. But, there may be a problem in who gets to use it. These programs are licensed by the feds, who obviously require that keys be shared with them. Additionally, there are export controls on advanced encryption systems based on number of bits in the block.
One may of course download a free encryption system such as PGP. The problem is that the web site may be compromised and the received program contain a hidden backdoor for secret monitors.
This is why it may make sense to design one's own encryption system, using ideas such as those I have sketched.
In particular, the first cipher I posted is impervious to any useful statistical analysis provided the key is changed at proper intervals.

'Noisy' ciphers

Suppose for some reason the previously posted encryption method isn't desirable. Well, here's another scheme that tends to defeat frequency analysis, the tool that professionals use to crack most amateur ciphers.
Use block scrambling plus fake noise.
Pick a number between, say, 15 and 20, and randomly select an ordering of the integers from 1 up to that number. You might use a random number generator for this purpose.
The alphabet, for example, is normally numbered a=1, b=2 and so forth. Require your program to re-order all letters in blocks of 15, or whatever, in accord with the ordering you've chosen. A cryptanalyst must try to figure out not only which block length you've used but also which ordering. For 15, there are 15! permutations, about 1.3 trillion.
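Here is a minimal sketch of just that block-transposition step, with an illustrative block length of 15 and a throwaway seed standing in for the secret ordering. The junk insertions discussed next are not included.

import random

BLOCK = 15
key = list(range(BLOCK))
random.Random(42).shuffle(key)          # the secret ordering (seed is illustrative)

def scramble(text):
    text += "x" * (-len(text) % BLOCK)  # pad the final block
    out = []
    for i in range(0, len(text), BLOCK):
        block = text[i:i + BLOCK]
        out.append("".join(block[k] for k in key))
    return "".join(out)

def unscramble(ciphertext):
    inverse = [key.index(j) for j in range(BLOCK)]
    out = []
    for i in range(0, len(ciphertext), BLOCK):
        block = ciphertext[i:i + BLOCK]
        out.append("".join(block[k] for k in inverse))
    return "".join(out)

plain = "attack the frequency analysis problem"
assert unscramble(scramble(plain)).startswith(plain)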
Still, he or she can apply frequency analysis and, for a large enough sample, have no trouble cracking the cipher. However, you now introduce plaintext gibberish words that contain high frequencies of x's and z's and other normally low-frequency characters. Your program randomly inserts such words into the text after randomly compiling character strings of a few lengths, drawing mostly on low-frequency characters with a few higher-frequency ones sprinkled in.
The decoder program then matches all words against an agreed dictionary and those that are gibberish are deleted.
A better idea is not to bother with plaintext words but to insert a set of digit strings that the decoder will read as junk codewords. That is, quite often information is sent with only a subset of the possible binary digit strings of some specified length. We then take a subset of unused digit strings of various lengths and randomly intersperse them into the encoded data stream with relatively high frequencies. The decoder program automatically rejects these strings but a cryptanalyst will have a tough time with them.
Obviously, such a system must still be unambiguous, so that a noise string can't be misread as part of a text string by the deciphering program. A carefully modified Huffman method would work.
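As a sketch of the junk-codeword idea, the toy below uses a tiny hand-built prefix-free code rather than a real Huffman tree. One codeword is reserved for noise: the encoder sprinkles it in at random, and the decoder, parsing the unambiguous code, simply drops it.

import random

CODE = {"a": "00", "b": "01", "t": "10", " ": "110"}   # toy prefix-free code
JUNK = "111"                                           # reserved noise codeword
DECODE = {v: k for k, v in CODE.items()}

def encode(text, noise_rate=0.3):
    bits = []
    for ch in text:
        if random.random() < noise_rate:
            bits.append(JUNK)                          # meaningless filler
        bits.append(CODE[ch])
    return "".join(bits)

def decode(bits):
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf == JUNK:
            buf = ""                                   # silently discard the noise
        elif buf in DECODE:
            out.append(DECODE[buf])
            buf = ""
    return "".join(out)

assert decode(encode("a tab at bat")) == "a tab at bat"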

The silence of major Jewish groups

The activism of the Reform and Conservative Jewish synagogue movements in defense of traditional American freedoms is in sharp contrast to the silence of secular Jewish rights watchdogs whose main focus these days is defense of the state of Israel, writes Marc Perelman in the Forward, a Jewish newspaper.

http://www.forward.com/articles/8070

While Reform and Conservative activists have publicly questioned the Bush administration's policy of warrantless NSA wiretapping, the American Jewish Committee, the American Jewish Congress and the Anti-Defamation League have all avoided taking a stance on the matter, Perelman says.
The feeling is that these lobbies, which have traditionally sought to defend Jewish interests and rights, are out of touch with grassroots American Judaism, their boards being dominated by wealthy hawks, Perelman says. Congress tends to take the opinion of the secular groups as representative of Jewish opinion, the writer says.
Perelman said nothing about Orthodox groups, which apparently have also taken no position on the Bush White House's major expansion of presidential powers in the name of fighting terrorism.

Wednesday, July 05, 2006

A call for state probes of 9/11

Clearly the feds are incapable of an honest probe of the events of 9/11. The Kean panel's report is essentially a lame whitewash of the "failure-to-connect-the-dots" cover story propounded by high-level federal officials.
However, states have a large degree of sovereignty and are fully capable of launching their own, independent investigations. As elections loom, candidates should be urged to call for such inquiries.
It is disgraceful that the states of New York, New Jersey, Pennsylvania and Connecticut have simply bowed to the feds when it is impossible that their law enforcement agencies are unaware that their residents died as a result of federal treachery. Yet, any state is able to conduct such an inquiry, and here's hoping many states make such efforts.
State elective officials are not low-level underlings of federal officials. They have a right and a duty to contradict federal claims when those claims are highly injurious to the American democracy.
So let's get cracking.
Plus, we need a private commission, hopefully led by a distinguished but no-nonsense scientist, to also look into the matter of war crimes against the American people and other peoples by an underhanded use of federal security agencies.
Does this call sound far-fetched merely because you won't read about it soon in the New York Times? Remember the movement among states to permit citizens to carry concealed weapons? That wasn't noted by the Establishment media until some 20 states had adopted such laws.
And that movement is not the only one in recent years to gain clout despite attempts at manipulation through national media silence.

A hard-to-beat cipher?

We have the set of integers from, say, 0 to 100 million -- call it S -- as our primary source of cipher numbers. Supposing we have 100 plaintext characters we wish to represent, we intend to establish 100 subsets of S at 1 million integers per set. We then randomly draw numbers from S, without replacement so that no two subsets overlap, and assign them to the subsets. That is, each subset will have 1 million integers chosen randomly from S.
We now assign a plaintext character to each subset, meaning that any element of the subset represents the character.
For example, the letter "e" would be represented by any one of a million numbers.
When a message is sent, a program randomly, or pseudorandomly, selects a number from the set assigned to a character and then discards it for the remainder of the transmission. That is, "e" would be represented by a different integer for every occurrence. But the decipherer program would check each number against the subsets and would know which character was intended.
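Here is a minimal sketch of the scheme at a reduced scale: each character owns its own pool of integers, every occurrence is enciphered with a fresh number that is then discarded, and the receiver maps numbers back through the shared subsets. The pool size is tiny here for readability; the scheme described above envisions about a million numbers per character.

import random
import string

rng = random.Random(2006)                    # shared key material (illustrative seed)
ALPHABET = string.ascii_lowercase + " "
POOL = 1000                                  # numbers per character in this toy version

numbers = list(range(len(ALPHABET) * POOL))
rng.shuffle(numbers)
subsets = {ch: numbers[i * POOL:(i + 1) * POOL]          # each character's private pool
           for i, ch in enumerate(ALPHABET)}
lookup = {n: ch for ch, pool in subsets.items() for n in pool}

def encipher(plaintext):
    out = []
    for ch in plaintext:
        n = subsets[ch].pop(rng.randrange(len(subsets[ch])))   # use a number once, then discard it
        out.append(n)
    return out

def decipher(ciphertext):
    return "".join(lookup[n] for n in ciphertext)

msg = "hello hello"
ct = encipher(msg)
assert decipher(ct) == msg
assert len(set(ct)) == len(ct)               # no number repeats, so frequency analysis gets no grip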
What this means is that an enciphered message would contain no repeating numbers.
Hence, frequency analysis would fail. It's impossible to determine the probability that some number represents say an "e" because although "e" shows up often in the message, "e" is represented by a set of unique numbers, each of which shows up only once.
If an identical message was sent twice, it is highly improbable that its numerical representation would recur. For example, the probability of "hello" being represented by the same digit string twice is about 10^(-30).
Even if a particular message was decoded without use of the key, there is only a very low chance of a subsequent message being decoded.
No program -- not even one of the proposed quantum computer designs -- is likely to be able to crack that cipher given a sample of a few messages, though, given enough messages, that possibility exists. However, it's possible to either randomly rebuild the subsets after some interval or to begin with a number much higher than 10^8.
An advantage of the scheme is that it doesn't need to hide frequencies in a smokescreen of false noise, which keeps it fairly compact for an enciphered message. A disadvantage is that it requires that receiver and sender share the key. Yet, the program could be used in concert with a public key encryption system.
A reason for not using public key encryption alone is efficiency. Some might eschew the public key method altogether because sometimes the primes are low enough for supercomputers to crack, or because there is no telling how far advanced quantum computers have come in classified research.
Enciphered messages usually tend to be less compact than unenciphered ones simply because most of the numbers used require more binary digits than the corresponding ASCII character requires (though ASCII is not highly efficient).
I realize this stuff is old hat. But suppose you have a legitimate need to communicate privately and don't trust commercial encryption programs because of the possibility of a backdoor key given to the feds? Now you have something simple that should work quite well.

Saturday, July 01, 2006

Steve Nass: Running interference for traitors

Wisconsin state legislator Steve Nass seems like a regular guy. The onetime payroll expert is a veteran of Operations Desert Storm and Desert Shield. He's a member of the Air National Guard, the American Legion, the Veterans of Foreign Wars, and the Kiwanis.

The Republican got into the news recently by calling for the ouster of a University of Wisconsin instructor who is convinced that feds engineered the 9/11 attacks. Of course, the instructor, Kevin Barrett, is an easy target. He's a Muslim who teaches about religion.

Yet the fact that a Muslim expresses doubt about 9/11 does not somehow absolve the Bush bunch.

People like Nass do a grave disservice to America by using their positions to promote the coverup of treason. Now one cannot assume that Nass has consciously decided to side with treason. Clearly he's following the party line, and the party line is that "none dare call it treason." Rather than being an out-and-out traitor, Nass more likely falls under the category -- in the jargon of intelligence operatives -- of "useful idiot" (aka "pawn" or "dupe").

It seems quite likely that Nass, who finds Barrett's charges "outlandish," is truly unaware of the treason of 9/11 because he assumes it can't be so and hence has done no spadework to find out what really might be going on in Washington. He's reminiscent of those gullible Americans who once assumed that the communist conspiracy couldn't be that bad merely because the press didn't have much on it.