
Clips June 11, 2002




ARTICLES

U.S. Should Control Internet Body, Senator Says
House puts terrorism information sharing bill on fast track
New law takes aim at ID theft
Rust Belt States Try High-Tech
Terra Lycos Unveils New Online Music Service
White House Stressing Unorthodox in IT Security Fight
Internet Cafes Face Crackdown
Israeli Device Detects Cell Phones Acting as Bugs
Powerline networks your home
Net the newest place to hawk stolen goods
Long-lost password discovered
Proposed agency will focus more attention on cybersecurity
IBM reports storage breakthrough
Flaw reported in IE browser
Webcasters win one battle in war over internet radio
New domain name regime to go live on July 1
Boom to bust for domain name claims
Govt gives free software a run
Russians create 3D power map
Opening the Open Source debate - II
Browsing Around for New Targets
New E-Waste Solution a Mine Idea
'Grid' Computers to Simulate Terror Scenarios
IBM To Announce Chip Technology That Drains Less Power
China, Japan Signed Memorandum on Internet Technology Cooperation
Israeli companies not adequately prepared for disaster
Mass. roundtable panelists talk up broadband, Web security
Clarke: Homeland security revamp to help cybersecurity
EU: MS Passport is under investigation
Blind gain free access to 53 newspapers online

******************
Reuters
U.S. Should Control Internet Body, Senator Says
By Andy Sullivan

WASHINGTON (Reuters) - A U.S. senator said he would try to rein in the group that oversees the Internet's traffic system, calling for a more direct U.S. government role in the ostensibly international and independent body.


Sen. Conrad Burns, a Montana Republican, on Monday said he likely would introduce a bill to require the Internet Corporation for Assigned Names and Numbers, also known by its acronym ICANN, to give the U.S. government more influence in managing the domain-name system. The system lets Internet users navigate the Web with easy-to-remember names like "www.example.com."
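
For readers unfamiliar with what "managing the domain-name system" means in practice, here is a minimal sketch of the lookup that happens every time a name like www.example.com is typed; it uses only Python's standard library.

```python
# A minimal sketch of what the domain-name system does behind the
# scenes: translating a memorable name into the numeric address
# computers actually route by.
import socket

address = socket.gethostbyname("www.example.com")
print(address)  # a numeric IP such as 93.184.216.34; the value may vary
```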


If ICANN failed to cooperate, Burns said, it could be stripped of its authority when its contract comes up for renewal this fall.

In a statement released two days before a Senate subcommittee is scheduled to hold hearings on the global body, Burns said the change was necessary because ICANN has exceeded its authority, does not operate in an open fashion, and is dangerously unaccountable to Internet users, businesses and other key interest groups.

"The U.S. needs to ensure ICANN operates with the same sort of internal processes as in any other federal agency," he said.

ICANN is not a part of the U.S. government, but operates under a 1998 contract with the Department of Commerce that requires it to meet several conditions before it assumes full control of the domain-name system.

But it is uncertain whether ICANN will ever meet those conditions and obtain full control, according to a congressional investigator's prepared testimony obtained by Reuters.

Criticism of ICANN comes from many quarters, not just U.S. politicians. Dozens of other governments have charged ICANN with being too dominated by U.S. interests, while the domain-name industry and grass-roots "cybercitizens" have raised their own grievances.

PROGRESS SLOW ON MANY FRONTS

ICANN has introduced competition into the domain-name business, driving down Web-site registration prices from around $50 to $10, according to testimony prepared by Peter Guerrero, a director of the General Accounting Office, the investigative arm of Congress.

But it has made little progress toward fulfilling other requirements such as boosting security of the domain-name system and setting up an inclusive governance process, Guerrero said.

ICANN has made little progress in formalizing relationships with other volunteer groups that oversee aspects of the domain-name system, he said.

"Until these issues are resolved, the timing and eventual outcome of the transition effort remain highly uncertain, and ICANN's legitimacy and effectiveness as the private-sector manager of the domain-name system remain in question," Guerrero said in his written testimony.

Guerrero also took the Department of Commerce to task for what he called its informal, hands-off approach to ICANN oversight, and recommended that Commerce issue periodic progress reports.

ICANN President M. Stuart Lynn declined to comment on Burns' proposal, saying he would wait until the bill was actually introduced.

"We respect Sen. Burns' interest in ICANN, and we'll have to see what evolves," said Lynn, who is scheduled to testify on Wednesday before the Senate science, technology and space subcommittee, on which Burns sits.

Lynn said Commerce exercises considerable oversight, noting that the group cannot add new "top-level" domains to join the likes of ".com" without the federal agency's approval.

A Commerce official declined to comment ahead of the hearing, when Assistant Secretary Nancy Victory is also scheduled to testify.

ICANN has struggled over the past several years to figure out exactly how it should function and who should participate. After direct elections open to any Internet user filled five of 19 board seats, the board of directors in March ruled out further elections.

An official reform committee released its blueprint last week, which will be considered when ICANN next meets at the end of the month in Romania.
*******************
Government Computer News
House puts terrorism information sharing bill on fast track
By Wilson P. Dizard III


The Bush administration and leaders of both parties in the House are working with the Judiciary Committee to rush through a bill that would require the CIA, the FBI and other federal intelligence agencies to share information with state and local police.

The Judiciary Subcommittee on Crime, Terrorism and Homeland Security approved the Chambliss-Harman Homeland Security Information Sharing Act June 4, two days before President Bush unveiled his plans for a Homeland Security Department. The full committee is set to mark up HR 4598 at 10 a.m. Thursday.

The bill would eliminate stovepipes that prevent information sharing among state and federal agencies charged with fighting terrorism, its House sponsors said.

Rep. James Sensenbrenner (R-Wis.), Judiciary chairman; Rep. Nancy Pelosi (D-Calif.), the Democratic whip; and 28 other lawmakers have endorsed the bill. Rep. Saxby Chambliss (R-Ga.) co-authored the legislation with Rep. Jane Harman (D-Calif.). Neither of the bill's authors serves on the subcommittee, but both are members of the House Permanent Select Committee on Intelligence's Subcommittee on Terrorism and Homeland Security.

The bill also would direct the attorney general and the CIA director to develop procedures for information sharing via existing networks, such as the National Law Enforcement Telecommunications System, after stripping intelligence data about sources and methods. Further, it would increase the number of security clearance investigations at the state and local levels to ease concern about distributing classified information.

The bill would mandate the use of declassification methods similar to those now used to share intelligence with NATO and Interpol members.

"While we have enhanced the capabilities of the federal, state and local officials to prepare and respond [to terrorism], as a nation, we still lack a coherent, effective and efficient way to share sensitive intelligence and law enforcement information among those who need to know," Chambliss said at the subcommittee hearing. "Our police officers, firefighters, sheriff's offices, medical personnel and elected officials must be informed of threats that exist in their communities so that they are able to protect the citizens in their own towns."

The Office of Homeland Security and the CIA helped draft the bill.
*******************
SFGate
New law takes aim at ID theft

Businesses nationwide are scrambling to comply with a new California law that sharply limits the use of Social Security numbers to identify customers.

The law, which takes effect July 1 for most companies, is designed to thwart identity theft. Another provision of the law makes it easier for consumers to alert credit bureaus when they think their identity is at risk.

Beginning Jan. 1, a third provision will let consumers temporarily freeze all access to their credit reports.

Under SB168, companies cannot:

-- Post or display SSNs.

-- Print them on identification cards or badges.

-- Print an SSN on anything mailed to a customer unless it's required by law or the document is a form or application.

-- Require people to transmit an SSN over the Internet (through e-mail, for example) unless the connection is secure or the number is encrypted (a brief sketch of the encryption option follows this list).

-- Require people to log on to a Web site using an SSN without a password.
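
As a rough illustration of the encryption option mentioned above, the sketch below scrambles an SSN so it is opaque in transit without the key. It assumes the third-party Python package "cryptography"; the number shown is a well-known dummy SSN, not a real customer's.

```python
# Hypothetical sketch of encrypting an SSN before it travels over the
# Internet -- one way to satisfy SB168's "encrypted" option.
# Assumes the third-party "cryptography" package is installed.
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # held by the company, never transmitted
cipher = Fernet(key)

ssn = b"078-05-1120"            # famous dummy SSN used in examples
token = cipher.encrypt(ssn)     # opaque ciphertext, safe to send
print(cipher.decrypt(token))    # b'078-05-1120' -- recoverable only with the key
```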

Although the SSN restrictions apply only to accounts opened in California after June 30, many companies are adopting the new rules for all customers -- old and new, in every state.

Companies say it's simpler to have a single system. Also, companies that continue using SSNs for old accounts must give customers an annual opportunity to opt out. Finally, it's only a matter of time before other states -- and possibly the federal government -- follow suit. Four states have bills pending that would restrict the use of SSNs.

The California law has several exceptions:

-- Health care and insurance companies have until Jan. 1 to comply with the new rules (except the one prohibiting SSNs on ID cards) for individual policy holders. After Jan. 1, 2004, they must comply with all the SSN restrictions for new individual policy holders and new group polices. By July 1, 2005, they must comply with all restrictions for all existing policies.

-- The law does not apply if a state or federal law requires an SSN on a document. That means SSNs will still appear on federal and state tax forms and employee pay stubs.

-- Private schools and colleges have to comply, but public ones don't. However, the University of California and the California State University systems are already moving away from using SSNs as student ID numbers.

-- Companies may still use SSNs to identify customers internally.

The American Benefits Council, which represents large employers that sponsor health and pension plans, recently held a conference call to educate members about the new law.

"It had the biggest response to any call we've ever had," says John Scott, the council's director of retirement policy. "This is a big issue. The use of SSNs is pervasive in retirement plan administration. To modify your systems to comply with this law is a huge effort."

J.P. Morgan/American Century Retirement Plan Services began preparing for the law in November and hopes to finish testing its new system this week.

It will continue using SSNs to identify all customers internally, but not externally, using a product known as an invisible font.

"The printer prepares an index for us on CD-ROM that's sorted by Social Security number. No Social Security number will show up on the statement sent to individuals. But if someone calls in with a question, we can look up the account on the CD-ROM," says Robert Holcomb, a vice president with the firm.

Scott says other companies are using truncated or encrypted SSNs. He says at least one is embedding customer SSNs in the middle of a bar code that will appear on customer statements.
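
A truncated SSN of the kind Scott describes is simple to produce; this sketch (function name invented) shows the usual convention of keeping only the last four digits.

```python
# Sketch of the "truncated SSN" approach: statements show only the
# last four digits while the full number stays internal.
def mask_ssn(ssn: str) -> str:
    digits = [c for c in ssn if c.isdigit()]
    return "XXX-XX-" + "".join(digits[-4:])

print(mask_ssn("078-05-1120"))  # XXX-XX-1120
```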

Many small companies "are only now trying to figure out how to comply," Scott says.

He says there could be some mistakes as companies switch to a new system, "especially at the rollout. There's also the possibility of people being overly cautious and not sending all the information that should be sent out."

Companies that fail to comply with SB168 are subject to general unfair-business-practice penalties.


SECURITY ALERTS
SB168, sponsored by Sen. Debra Bowen, D-Marina del Rey, also establishes uniform requirements when consumers post a security alert to their credit file.


A security alert warns lenders that a person may be a victim of identity theft.

This provision, which also takes effect July 1, requires the three major credit bureaus to provide a toll-free telephone number for people to place security alerts and to place the alert in a consumer's credit file within 72 hours of receiving it.

The alert must remain in place for at least 90 days, and credit bureaus must offer consumers a free credit report at the end of 90 days.

The toll-free numbers for the credit bureaus are Experian, (888) 397-3742; Equifax, (800) 685-1111; and Trans Union, (800) 888-4213.

A security alert is advisory only. It encourages but does not require lenders to take extra precautionary measures to verify a person's identity.

Sen. Bowen has sponsored a new bill, SB1730, that would force businesses that access a credit report to take certain steps to communicate with a person who placed a security alert to make sure he or she is not an identity thief. That bill passed the Assembly Banking Committee 8-3 Monday and now moves to the Assembly judiciary committee. It passed the Senate 24-11.


CREDIT REPORT FREEZE
Beginning Jan. 1, SB168 will let consumers freeze access to their credit reports.


The freeze prevents thieves from getting credit in your name, even if they have your SSN, address, birth date and other identifying information. It also prevents unwanted companies from prying into your credit report.

But it also could prevent you from getting a loan because most lenders won't extend credit without a credit report.

If you're going to be shopping for a loan and you have a freeze on, you can allow your credit report to be released during a specific period of time, to specific lenders or to lenders who have a secret code you give them.

It could take up to three business days to have a freeze lifted, so if you get a weekend whim to buy a Mini Cooper, you could be out of luck.

To read SB168 or SB1730, go to democrats.sen.ca.gov/senator/bowen/ and click on Legislation.
*******************
Associated Press
Rust Belt States Try High-Tech
Mon Jun 10,10:03 PM ET
By BEN DOBBIN, Associated Press Writer


ROCHESTER, N.Y. (AP) - Early in his career at Bell Labs in New Jersey, Wayne Knox tethered the speed of light, determining how to make a laser beam flash on and off in eight quadrillionths of a second.

That achievement helped speed digital communications, and earned Knox mention in the Guinness Book of Records.

Lured back to his hometown and alma mater a year ago, Knox is now viewed as a brainy beacon who could play a vital role in widening New York's slice of America's high-tech pie. Already, the University of Rochester professor has hired three topflight researchers in biomedical and communications optics.

State capitals from Albany to Lansing, Mich., duking it out to build the next money-spinning research hub, are actively recruiting star scientists like Knox as they lay down heavy bets on optoelectronics, bioinformatics and other odd-sounding hybrid industries they hope will be their ticket to success in the 21st century.

New York alone is investing a record $250 million this year to build four "Centers of Excellence" geared to link up top talent in industry and academia and serve as magnets for investment.

Some regard the strategy as crucial in helping struggling cities re-engineer Rust Belt economies at a time when manufacturing is in free fall. For its part, New York lost 40,000-plus manufacturing jobs from 1997 to 2000, more than any state.

Nationwide, state government sponsorship of high-tech research became much more substantial in the early 1980s as global competition forced big companies to dispense with discovery-level research and rely more heavily on academia.

Great Lakes states began putting aside up to $20 million a year to begin the often painful switch from smokestacks to science. But the biggest wave of state investment carried nearly all states with it in the late 1990s.

Using tobacco settlement funds, Michigan is investing $1 billion over 20 years to create a life-science corridor between Detroit and Ann Arbor. Pittsburgh, Atlanta, Indianapolis and Kansas City are trying to carve out niches in the life sciences.

Although slowed by the recession, state ventures have remained vigorous and varied, funded in myriad ways: technology incubators, venture capital programs, tax credits, improved access to higher education.

Some regions look to mimic models like Research Triangle Park in North Carolina, which drew much of its success from commercializing technology developed by academia. Others hope just to keep pace.

Failures are inevitable but "if you don't make the investment, clearly you're not going to go anywhere," said Dan Berglund of the State Science and Technology Institute in Westerville, Ohio. "Even if the research doesn't pan out you're going to have a work force that's better educated, has a higher level of skills and will be able to compete in the global economy."

States that offer a comprehensive package of incentives are likely to come out on top, he said.

Two years ago, the newly created New York State Office of Science, Technology & Academic Research quickly dispensed $102 million to establish 13 academic research centers, eight focusing on biotechnology.

Its Centers of Excellence will involve dozens of universities focusing on regional strengths -- nanoelectronics in Albany, information technology on Long Island, photonics in Rochester, bioinformatics in Buffalo -- and draw much of their funding from industry led by titans like IBM and Eastman Kodak.

One linchpin, most agree, is getting universities to move bright ideas to the marketplace. The University of Rochester boosted its technology-transfer revenues from $3 million in 1999 to $40 million last year.

Another is recruitment, where science stars like Knox come in.

Born in 1957, Knox designed his first ultrafast laser at age 17 while working a summer job at the University of Rochester. A year after joining Bell Labs in 1984, he generated the world's shortest laser pulse.

Think of the milestone this way: In just over a second, light can race from Earth to the moon; each time Knox's experimental laser blinked, the light had traveled a mere one-tenth the thickness of a human hair.
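
The analogy holds up to simple arithmetic; the sketch below assumes a mean Earth-moon distance of roughly 384,400 km and a fine hair about 25 micrometers across.

```python
# Back-of-the-envelope check of the article's analogy.
c = 299_792_458            # speed of light, meters per second
moon = 384_400_000         # assumed mean Earth-moon distance, meters

print(moon / c)            # ~1.28 -- light reaches the moon in just over a second

pulse = 8e-15              # eight quadrillionths of a second
print(c * pulse * 1e6)     # ~2.4 micrometers, about a tenth of a fine hair's width
```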

The more on-off repetitions in a fiber optic system, the backbone of digital telecommunications networks, the more rapidly information can be transmitted.

In 1997, Knox achieved further distinction. His research team sent 1,021 separate wavelengths of light traveling down a single optical fiber -- 64 times the number used by most commercial networks today.

"The world will have to prepare for a time when everybody has a high-speed connection," said Knox, who heads the University of Rochester's venerable Institute of Optics, mixing teaching with research with the help of a $1 million state grant.

Knox is a firm believer in the economic power of the broadband revolution.

"Some people think if we all had broadband it could be worth $300 billion or $400 billion a year. This by itself could pull the entire U.S. economy out of recession."
*******************
Associated Press
Danish Publishers in Court Over Links
Mon Jun 10, 7:27 AM ET
By ANICK JESDANUN, AP Internet Writer


Nicolai Lassen considers linking such a fundamental element of the World Wide Web that he sees nothing wrong with creating a service around linking to news articles at more than 3,000 other sites.


Danish publishers, however, equate such linking with stealing and have gone to court to stop it.


The case, scheduled for hearings in Copenhagen later this month, is among the latest to challenge the Web's basic premise of encouraging the free flow of information through linking.

Requiring permission before linking could jeopardize online journals, search engines and other sites that link -- which is to say, just about every site on the Internet.

If the Web's creators hadn't wanted linking, "they would have called it the World Wide Straight Line," said Avi Adelman, a Web site operator involved in a dispute over linking to The Dallas Morning News.

Most of the court cases and legal threats have been over a form of hypertext-connecting called deep-linking, by which you simply connect users to a specific page rather than a site's home page.

Such disputes reflect "a frustration certain people have with a loss of control" once they post something, said Michael Geist, law professor at the University of Ottawa.

Lassen's Newsbooster service tries to make news stories easier to find by presenting links to items with keywords of a user's choosing. It's much like a search engine, except Newsbooster charges a subscription fee and lets users choose to automatically receive links by e-mail.
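
Functionally, that is little more than keyword filtering over crawled headlines. Newsbooster's actual code is not public; the sketch below, with invented names and data, shows the general idea.

```python
# Invented illustration of Newsbooster-style matching: keep only the
# deep links whose headlines contain one of a user's keywords.
def matching_links(keywords, articles):
    wanted = [k.lower() for k in keywords]
    return [url for title, url in articles
            if any(k in title.lower() for k in wanted)]

articles = [
    ("ICANN faces Senate scrutiny", "http://news.example.dk/icann-hearing"),
    ("Copenhagen opens new metro line", "http://news.example.dk/metro"),
]
print(matching_links(["icann"], articles))  # only the first link survives
```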

"From the home page down to the actual story you want to read can be a very, very long way," said Lassen, Newsbooster's editor-in-chief. "By using a technology such as Newsbooster, you save a lot of time."

The Danish Newspaper Publishers' Association believes Newsbooster should either shut down or negotiate payments.

"We consider it unfair to base your business upon the works of others," said Ebbe Dal, the group's managing director.

Not that opponents of deep-linking always object to it.

Dal thinks it's OK for a newspaper to offer a deep link or two accompanying an article, or for search engines to help users navigate.

Belo Corp. likewise prohibits deep-linking to its sites, including the Morning News. But one of its newspapers, the Providence Journal, maintains an online journal that deep links to other sites.

Belo spokesman Scott Baradell was quoted by several news organizations as saying the company isn't against all deep-linking. But he would not offer specifics on why it objects to deep links to Morning News articles on Adelman's non-subscription site, which covers local Dallas affairs. Contacted by The Associated Press, Baradell said he would have no additional comment.

Reasons for opposing linking vary.

In a federal lawsuit, Homestore.com Inc. complains that Bargain Network, by deep linking to Homestore's real estate listings, interferes with its opportunities to sell advertising.

Others, like the Council of Better Business Bureaus, worry that a link -- deep or otherwise -- can imply endorsement, even if it reaches nothing more than a page with tips. The organization has persuaded thousands of sites to remove links to its Web pages, citing trademark claims.

But to Web purists, a link is no more than a footnote or a page reference. To ban deep-linking, they say, is to prohibit newspaper readers from going straight to the sports pages because they might miss advertising in the front section.

Besides, linking is a way for sites to boost traffic.

"Historically at least, there has been a tradition that if you put something up on the World Wide Web, it would be a public resource," said Matt Cutts, a software engineer at Google. He said Google removes links when asked, though few sites request it as most want to be found.

Early U.S. court decisions have sided with deep-linking. Exceptions are in cases of framing, where a site tries to make information from other sites appear as its own, and ones involving links to tools that circumvent anti-piracy measures built into commercial software.

"It was one of those issues that people thought was more or less settled," said Jorge Contreras, vice chairman of the Internet Law Group at Hale and Dorr firm. "For whatever reason, these last couple of months, a spate of new disputes have come up."

If they are resolved in favor of plaintiffs opposed to deep-linking, legal experts say that could encourage more lawsuits and more moats going up around certain Web sites.

Several sites, including the Belo papers, Overnite Transportation Co., ACNielsen research firm and KPMG International, ban all or some deep-linking. The International Trademark Association and The Washington Post reserve the right to prohibit it on a case-by-case basis.

The Albuquerque Journal and American City Business Journals have attempted to charge for the right to deep link. Although editors acknowledge they won't take action against casual deep-linkers, they say a handful have been willing to pay -- $50 in Albuquerque's case.

"There are some companies that would rather pay to get a piece of paper and get that blessing," said Donn Friedman, the Albuquerque paper's assistant managing editor for technology.

Technology exists for sites that truly want to block deep-linking.

For example, the news site for The Associated Press, The WIRE, checks what site a user comes from. If it isn't a site authorized to use deep links, the user is automatically directed to a default page and required to enter through one of the AP's member newspapers or broadcasters.
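
The AP has not published how The WIRE implements this, but a referer check of that general shape can be sketched in a few lines. The Flask route and site names below are invented for illustration; note too that the Referer header is easily spoofed, so this is a speed bump rather than a lock.

```python
# Hypothetical sketch of a referer-based deep-link gate, loosely
# modeled on the behavior described above. Not AP's actual code.
from flask import Flask, redirect, request

app = Flask(__name__)
APPROVED = ("memberpaper.example.com", "broadcaster.example.com")

@app.route("/wire/<story_id>")
def story(story_id):
    referer = request.referrer or ""
    if not any(site in referer for site in APPROVED):
        return redirect("/front-page")   # bounce to the default entry page
    return f"Story {story_id}"           # serve the deep-linked article
```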

Other sites can require registration or paid subscriptions.

Though Web site operators don't always like technical blocks, they prefer that to a legal environment where a ban is presumed and permission must be sought each time.

Weldon Johnson of LetsRun.com, involved in a dispute this spring with Runner's World magazine, said that as long as sites keep the doors open, "it's totally wrong for them to say you have to link to certain pages."
*******************
Reuters
Terra Lycos Unveils New Online Music Service
Tue Jun 11,12:07 AM ET
By Reshma Kapadia


NEW YORK (Reuters) - Spanish Internet media company Terra Lycos on Tuesday launched an online radio-style music subscription service that will offer music from four major record labels.


Terra Lycos said it partnered with Listen.com to create the new Web service, Lycos Rhapsody, which will offer access to more than 10,000 albums. The service does not let users download the music or record it on CDs, but the company plans to offer that capability later.


"This is another step in the evolution of the digital music market," said Gartner2Media Research Director P.J. McNealy.

Online music efforts by record companies have not resonated strongly with Web surfers. After being spoiled by the range of artists and possibilities that online-music swapping service Napster offered for free, many Web surfers have been turned off by the price and limits of the services backed by record labels.

Listen.com has struck deals with Bertelsmann AG's BMG, Sony Corp.'s Sony Music Entertainment, EMI Group Plc's EMI Recorded Music and AOL Time Warner Inc.'s Warner Music Group, as well as many independent labels.

Listen.com Chief Executive Scott Ryan said the company was also in talks with Vivendi Universal's Universal Music unit.

Like many Internet media companies, Terra Lycos is feeling the sting from the sharp downturn in advertising spending. Lycos Rhapsody is the company's latest effort to shift away from ad revenues toward paid services.

Internet media companies "are looking for any add-on service that means more revenues, and every single (Internet media) company is trying to transform their business into an entertainment source, not a data source. Music is the first step," said McNealy.

Listen.com's Ryan said the company would introduce CD-burning capability later this year.

"We are working with Lycos on that," Ryan said. "For right now, there are a substantial amount of people who listen online, and we have to address them."

When the service does offer CD-burning capability, Listen.com will have to renegotiate the licenses with the record labels. Ryan said he was already engaged in such talks.

Terra Lycos will give Web surfers free access to the service until June 30. After that, users will be able to choose one of three tiers of service. The free tier will offer 20 different radio stations with FM-quality sound.

The second tier, which company executives likened to "basic" cable, will let users access more than 50 commercial-free radio stations with CD-quality sound and the ability to skip tracks, at a cost of $5 a month.

The last tier will cost about $10 a month and allow for unlimited music via more than 50 commercial-free radio stations with CD-quality sound and the ability to skip tracks and save them to a library.

"It's building on the base of music services we've offered in the past," said David Pritchard, senior director of music, TV and film for Terra Lycos. "Obviously, we have to keep running and so we want to stay on the leading edge of music services that are being offered online.
********************
Washington Post
White House Stressing Unorthodox in IT Security Fight
Brian Krebs


The Bush administration is playing "dirty" with the private sector in a roundabout attempt to fortify the nation's computer security defenses, the White House's cybersecurity czar said today.

Richard Clarke, the president's special adviser for cyberspace security, said unorthodox approaches may be needed to get the attention of companies that own and operate the infrastructure of the Internet yet do not respond to the administration's self-regulatory, hands-off approach to cybersecurity.

The administration has been talking to insurance firms about the idea of writing cybersecurity insurance for companies, Clarke said, offering an example of one carrot-and-stick approach.

The catch, however, is that the coverage would only be available to companies that meet certain criteria developed by the insurance industry and the private sector.

"Some of what we do may be a little dirty, but we're doing it," Clarke said at the Networked Economy Summit in Reston, Va.

Clarke's office also has been quietly talking about the possibility of fostering a private sector certification program for information technology security companies.

"How do you know - when you hire an IT security company to do a vulnerability assessment - that they know what they're doing?" Clarke asked. "Maybe there should be an outside process to certify those vendors."

The proposals come as the Bush administration is preparing its "national strategy" for protecting the nation's most vital computer networks from cyberattack.

Clarke and his advisers have been conducting town meetings across the country to raise awareness about the issue, and last week his group met with three dozen university officials to discuss their role in protecting the nation's critical infrastructures and to gather input on the administration's plan.

The White House had earlier said it would release its national plan by mid-summer. But Clarke said today the release date would be pushed back to the end of the summer or mid-September.

Administration officials are currently focused on realigning responsibility for cybersecurity within the new proposed Homeland Security Department.

Clarke said the administration hopes the new cabinet-level agency will be the future home of several federal cybersecurity programs, including the FBI's National Infrastructure Protection Center, the Commerce Department's Critical Infrastructure Assurance Office, the General Services Administration's FedCIRC, and the Defense Department's National Communications System.
********************
Los Angeles Times
Internet Cafes Face Crackdown
Safety: After shooting death, officials in Garden Grove say they'll step up enforcement.
By DANIEL YI
TIMES STAFF WRITER


June 11 2002

After the weekend murder of a 14-year-old boy who had visited a local Internet cafe, Garden Grove officials said Monday they will step up regulation of such businesses and consider further restricting their operation.

Police also said they would patrol the cyber cafes more carefully.

"We will put some teeth into our ordinance," said Garden Grove Mayor Bruce Broadwater, who met with council members Sunday in an emergency session to discuss the issue. "We are going to be very, very tough" on cyber cafes. The cafes, also called PC rooms, have grown in popularity across the country, especially with young people, who play computer games and surf the Internet for hours.

But officials in Garden Grove and elsewhere say the establishments also attract violence.

Early Saturday morning, Edward Fernandez, whom relatives described as an average boy who loved sports, was gunned down near his home after he was apparently followed from I.C.E. Internet Cafe on Brookhurst Street.

In December, a 20-year-old was stabbed to death outside another Garden Grove Internet cafe.

The City Council passed an ordinance in January that restricted hours of operation and put a moratorium on new cyber cafes. The city has about 20.

Cyber cafes must close by 2 a.m., and, among other things, install security cameras to monitor customers, the ordinance says. No minors are allowed during school hours, past 10 p.m. Fridays and Saturdays and past 8 p.m. on other days. The city also has a 10 p.m. curfew for minors.

Police say Edward and some friends were at I.C.E. past curfew time. The cafe's manager and part owner, Quang Nguyen, said the boys had been there earlier in the evening but left before 10 p.m.

He said he does not allow minors after hours and that he told Edward and his friends when they returned shortly before midnight that they had to leave.

Police said Edward and the friends exchanged dirty looks with the passengers in a black car outside I.C.E. The car followed Edward and three friends who had taken a cab.

Witnesses said a passenger from the black car shot Edward several times as he was preparing to pay the cab driver.

Police are still searching for the suspects.

I.C.E. has a security video system, but when detectives asked for the tapes from late Friday, Nguyen was unable to provide them. City officials said I.C.E. will be fined for that violation and for allowing minors in the business past 10 p.m.

Nguyen said Monday the lack of tape was an oversight.

"That was a mistake on our part," he said, saying the cafe uses its security cameras sporadically. "There were a lot of things going on that night."

Among other things, Nguyen said, suspected gang members were loitering in front of his cafe about 11 p.m. He called police, but when they arrived, the young men left. Nguyen said police found no minors in his cafe, which has signs warning youngsters about the time restrictions. More regulation, he said, is not the solution.

"The kid happened to be here," Nguyen said of Edward. "He could have been anywhere else. He could have been coming back from the movies and kids who didn't like him would have followed him."

Broadwater disagreed.

"We don't see this as a fluke anymore," he said referring to the two killings associated with cyber cafes.

The City Council next week will consider further curtailing hours of operation for Internet cafes and creating a task force to study ways to enforce the law.

"We cannot post an officer on every cafe," police Sgt. Scott Hamilton said. "The kids are smart. They know which managers are lax."

Still, Hamilton said his department will increase patrols around cyber cafes.
*******************
Los Angeles Times
Computer Users' Time on Internet Up 13%
Bloomberg News

June 11 2002

The time computer users spent online in the last year rose as the number and length of Internet sessions increased. Time spent on the Internet each month rose almost 13% to 9 hours and 17 minutes from 8 hours and 15 minutes.

The average number of sessions per month in 21 countries increased to 18 sessions from 16 sessions in the year ended April 2002, according to Nielsen/NetRatings.
*********************
New York Times
Israeli Device Detects Cell Phones Acting as Bugs


TEL AVIV (Reuters) - Imagine your company is holding secret talks to buy another firm when your main competitor suddenly snaps it up from under your nose, apparently aware of all the details of the negotiations.

While you instigate a widespread investigation, the culprit could be nothing more sinister than a cell phone "accidentally" left in the corner of the room, placed in a plant pot or taped under the boardroom table.

With a slight modification, cell phones become high-quality bugs. An owner can call the phone from anywhere in the world without it emitting a ringing tone while its screen remains blank, apparently turned off.

"The beauty of the cell phone as a bug is that it's an innocent looking and ubiquitous object," said Ben Te'eni, co-founder of Netline Communications Technologies, which has developed a device for detecting cell phone communications, especially from cell phones in apparently dormant mode.

"People trust cell phones, but modified and left in idle mode the cell phone can be used as a transmitter for up to a week. If it's connected to a power supply it can provide endless intelligence. Professional bugsweepers will ignore the cell phone frequency since the phones are so common and not suspicious."

The drawback for cell phones -- and what enables Netline to catch them out -- is that they periodically transmit a signal to their base station. Left in a boardroom before or during crucial meetings, Netline's small Cellular Activity Analyzer (CAA) detects and records cell phone activity and emits a visual and audio warning.

"I can leave the CAA in the office before important meetings and it will tell me if there's a cell phone in the room," Te'eni said. "I can also leave it in the room overnight or for a number of days (after a meeting) to see if a bug has been left behind."

INTELLIGENCE BACKGROUND

Like many Israeli high-tech company heads in the telecoms sector, 33-year-old Te'eni and his co-founder Gil Israeli, 34, are graduates of an army intelligence unit. Te'eni was unwilling to elaborate on his army service or Netline's client list.

Having worked for state-owned Israel Aircraft Industries after leaving the army, the pair decided to branch out on their own and set up Netline in 1998.

Their first product was a jamming device that prevents cell phone calls in chosen areas of a building or in the open air, which Te'eni said has been sold to defense agencies of "blue chip governments" around the world.

"The jammer can be used by bomb squads or VIP security services to prevent the detonation of bombs by cell phones," Te'eni said.

"We have also sold to prisons because top criminals are known to continue their operations or coordinate testimony using smuggled-in cell phones. In Brazil, riots were synchronised in five prisons using cell phones and in Paris a prisoner escape was coordinated using cell phones."

Te'eni compared the innocent-looking and simple cell phone with the cardboard cutters used by the hijackers in the September 11 attacks in the United States.

Both have non-lethal and everyday uses that are positive, but can also make life easier for criminals.

"A phone can remotely activate a bomb or be used for tactical communications such as a terrorist act, bank robbery, hostage situation or kidnapping," Te'eni said. "There are so many negative ways for using cell phones which is why the ability to jam them is crucial."

PASSIVE MARKETING

Describing Netline's marketing as "passive" -- "customers come to us rather than us going to them" -- Te'eni said much of the firm's sales were from word-of-mouth recommendations.

"There are many security consultants and they know how to find us," he said cryptically, adding that Netline had sales last year of $1 million-$2 million.

As for the future, Te'eni said Netline, like many technology firms in the current global slump, was not "dreaming big dreams" but looking for steady growth as security officers become more open to questioning long-standing operational methods following the September 11 attacks on the United States.

"We want to find foreign strategic partners for selling our solutions worldwide to defense and espionage agencies. Security people are second-guessing themselves all the time now so the future looks good," Te'eni added.
*******************
SFGate
Powerline networks your home


Imagine that all the hassle associated with computer networking -- adapters, cables and the configuration issues -- suddenly disappeared, and all you had to do to connect multiple PCs to each other and to the Internet was to plug their power cords into an ordinary AC wall outlet.

We haven't reached that happy state yet, not by a long shot. But this spring, the networking industry has taken a long step in that direction, with a new generation of gear that finally makes what's known as "powerline networking" a reality.

Based on a year-old specification called HomePlug, these new products turn the AC wiring already built into the walls of your home or office into a local-area data network. You still need to add an adapter to each computer you want to connect -- HomePlug doesn't, at least for now, work through the same cable and power supply that provide the juice to run the machine.

But you don't need to open the box -- just plug a videocassette-sized external device into a USB or Ethernet port, then into a separate wall jack. And once you've done that, the adapter will translate the data coming from your computer into a signal that travels over the AC wires, using a different frequency than -- and not interfering with -- the ordinary current coursing through the same wire.

This isn't exactly a new idea -- three years ago, I reviewed an earlier generation of products that were supposed to do the same thing. But those were slow and unreliable and soon disappeared from the market, whereas the products I tried this time -- the Instant PowerLine series, from Linksys, a well-known vendor of home and small-office networking gear -- create a speedy and, as far as I've been able to tell, rock-solid connection, even in my 96-year-old home.

And while the early products were based on proprietary technology from small companies, HomePlug is a standard that's been adopted by scores of vendors, including such heavyweights as Cisco Systems, Motorola and Texas Instruments -- not to mention the RadioShack chain. I tested only the Linksys products, which were first to market, but other vendors are also supporting the HomePlug standard, and the trade association behind the spec -- the HomePlug PowerLine Alliance -- certifies products based on it to ensure they work together.

If you have more than one computer at home or in the office, there are many reasons you might want to network them -- to share files and printers, for example. But for most people the case gets compelling only when you have a high-speed Internet connection -- once you've got everything networked, users can enjoy the benefits of broadband from any of the connected machines.

Remember, though, that simply linking your machines to each other, over AC or anything else, doesn't get them on the Internet -- for that you have to have a cable or DSL line and modem, and if you want multiple machines connected to it, you need another device called a router. Powerline networking comes into play after you've got all that set up, when you want to provide fast Internet access for other computers that aren't already wired up to the router.

Suppose, for example, that you have a DSL line hooked up to a router in a downstairs den and a PC in the same room plugged into the router. That way, your connection is plenty zippy. The only problem is that your kids keep grabbing the machine away from you, because they're fed up with the slow speed of the dial-up connection they have to use when they're on their own PC in the upstairs bedroom.

What powerline provides is a painless way to give them access to the same high-speed connection you use. All you have to do is plug a powerline Ethernet bridge into the router and into an AC outlet in the den, then connect a USB PowerLine adapter or a second Ethernet bridge to the kids' computer.

(Eventually, Linksys and other vendors will offer routers with a HomePlug powerline connector built in, so you'll need a separate powerline adapter only for the computer you want to connect. Linksys won't have such a combo device until this fall, though, and as far as I know, no other company has released one yet.)


CONSIDERING THE ALTERNATIVES
Granted, there are already several other good ways to create a home or small-office network, but powerline has some advantages over each of them:


-- Compared with the classic approach -- running Ethernet cabling -- the big benefit of powerline is obvious: no new wires required, so no need to rip open floors or walls or to leave cables exposed. And as long as your computer has a USB port or two, you don't even need an Ethernet adapter.

-- Compared with wireless networking, based on the hot Wi-Fi (802.11b) standard, the big difference is range. Wireless LANs normally work within a radius of 100 to 200 feet from their base station or access point, but if there's a lot of obstruction -- from appliances or metal file cabinets, say -- the actual coverage area may be less, and often deteriorates toward the outer limits of the range.

HomePlug powerline gear, by contrast, is supposed to work at distances of up to 1,000 feet. My house is nowhere near big enough to test that claim, but I had no problems creating powerline connections in corners of my home that are beyond the range of my wireless equipment.

Powerline networks are also faster than Wi-Fi -- HomePlug is supposed to move data at 14 Mbps, compared with a maximum of 11 Mbps under the 802.11b standard. You won't notice any difference when surfing the Web because both technologies offer far more bandwidth than even the fastest standard cable or DSL connection. But if you spend a lot of time moving big files around your home network -- say, copying MP3 files from one PC to another -- HomePlug will definitely cut the wait perceptibly.
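
The arithmetic behind that claim, assuming (optimistically) that each technology hits its rated speed:

```python
# Time to move a 100-megabyte batch of MP3s at each rated speed.
# Real-world throughput runs well below these ceilings.
size_bits = 100 * 8_000_000        # 100 MB expressed in bits

print(size_bits / 11_000_000)      # ~73 seconds at Wi-Fi's 11 Mbps
print(size_bits / 14_000_000)      # ~57 seconds at HomePlug's 14 Mbps
```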

In addition, powerline networking seems to be free of the security concerns that have cast a shadow over the Wi-Fi world. Passers-by outside your home can't simply tap into an AC LAN, as they can with an unprotected wireless network. And to protect your data from anyone who does manage to plug into your wiring, the HomePlug specification includes 56-bit DES encryption. Perhaps some hacker or grad student will eventually find flaws in this spec, as happened with 802.11, but there have been no such vulnerabilities reported yet.

Wi-Fi, on the other hand, has a couple of big pluses over powerline networking: First, notebook owners can add Wi-Fi just by installing a credit card-sized PC card inside their machine. It's not yet possible to make a HomePlug adapter that small, so the only choice is an external box. It's not big as computer gizmos go, but it's not something you'd want to carry around with your notebook.

Second, Wi-Fi works in appropriately equipped laptops even when they're not plugged into an outlet. That's a pretty compelling advantage.

Altogether, if the only machine you're trying to network is a notebook, Wi-Fi is probably the better choice.

(If you want to connect two or more desktop machines plus a laptop, you can easily combine Wi-Fi and powerline networks.)

-- It's also possible to build a data network over your existing phone wiring -- the HPNA (for Home Phoneline Networking Alliance) standard makes it possible for computers plugged into your phone jacks to share information even when you're talking on the phone. Products based on version 2.0 of that spec work quite well.

There are two good arguments for preferring powerline, though. Most obviously, many rooms don't have phone jacks, but almost every room has an AC outlet. Second, in many homes, phone jacks in different rooms are wired for different numbers, but HPNA networking works only among outlets that share the same circuit.

If those are not issues in your situation, you can save a little money by opting for HPNA. Both of the Linksys Instant PowerLine products -- the USB Adapter and the EtherFast 10/100 -- have a list price of $150, but they're widely available for about $100 each, and bargain hunters can find them on the Web for less than $80. USB adapters for HPNA are down to $50 or even less, while for those willing to open their PCs, HPNA PCI cards are even cheaper than that.

A note to my Mac friends: Linksys' USB powerline adapter doesn't work at all with Apple hardware, but the Ethernet bridge does -- I tried it right out of the box with my iMac, and it worked fine. Since all Macs have had Ethernet ports for years, that's not much of a problem.

There's just one caveat. All Linksys powerline devices ship with the same encryption password, so they can talk to each other. But if you're concerned about the possibility of an intruder tapping into your powerline network, it makes sense to change the password (as long as you do so for all the powerline devices on your network).

To do that, you have to run a small program called the Security Configuration Utility, and that's available only for Windows. But if you have access to even one Windows PC, you can change the password on all your powerline devices there. Then connect them back up to the Mac, and they'll continue to work normally.


IF COMPUTERS ARE SO SMART . . .
A passing thought, completely unrelated to powerline networking: If computers are so smart, how come 99 percent of the registration and ordering forms I'm constantly filling out on the Web make me enter city, state and ZIP code when they want my address? Why can't I just enter the ZIP and let the machine look up the city and state?
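
For what it's worth, the fix the author wants is a one-table lookup; the sketch below uses a toy two-entry table, where a real site would consult a complete ZIP-code database.

```python
# Toy version of the form improvement the author asks for: type a ZIP,
# get the city and state filled in automatically.
ZIP_TO_PLACE = {
    "94103": ("San Francisco", "CA"),
    "10036": ("New York", "NY"),
}

def fill_address(zip_code):
    city, state = ZIP_TO_PLACE.get(zip_code, ("", ""))
    return {"zip": zip_code, "city": city, "state": state}

print(fill_address("94103"))  # the user types one field, the site fills three
```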
**********************
Government Computer News
FHWA awards a tech services pact


By Preeti Vasishtha
GCN Staff

The Federal Highway Administration has awarded a 10-year, $175 million contract to Indus Corp. to secure the agency's databases.

Under the Federal Highway Administration Information Technology Support Services contract, Indus will also help the agency with its enterprise architecture, network infrastructure, help desk, document management and telecommunications services.

The Vienna, Va., company will lead a team of subcontractors that include Centech Group Inc. of Arlington, Va.; PricewaterhouseCoopers LLP of New York; Signal Corp. of Fairfax, Va.; and York Telecom of Eatontown, N.J.
*******************
Mercury News
Net the newest place to hawk stolen goods
ONLINE AUCTION BUYERS LEFT BURNED
By Mary Anne Ostrom
Mercury News


Sharon Cooney couldn't have been happier with the new WinBook laptop she had bought on eBay -- until she learned it was stolen property. Today, the Marin County woman is out $1,075, the victim of a Maryland seller who allegedly unloaded $350,000 in hot goods through eBay.

The reach and anonymity of the Internet have helped turn online auctions into the newest way to fence stolen property.

"There's no need for the pawnbroker. Internet auctions have suddenly become a really easy way to fence stuff," said Frank Dudley Berry Jr., who prosecutes high-tech crimes for the Santa Clara County District Attorney.

Perpetrators range from sophisticated Silicon Valley high-tech theft rings that used eBay to fence up to $1 million in goods to a maid hawking a digital camera snatched from her client's home.

Cases include Michigan teenagers auctioning off stolen high-school band equipment and Marshall Field's window designers selling Judith Lieber handbags and other luxury goods taken from the Chicago department store.

Executives of San Jose-based eBay said the sale of stolen goods on the site, which typically has 7 million items offered, is "a minor issue."

Berry, the prosecutor, called eBay "remarkably" diligent in helping authorities track down criminals. Other online auction sites, including Yahoo, and specialized Internet bulletin boards are also used to sell stolen goods, but most reported cases occur on eBay, which accounts for about 80 percent of U.S. online auctions.

Although there's no public data on the number of stolen property auctions on the Web, interviews with police and other investigators reveal how criminals exploit the breadth and anonymity of online auctions to unload stolen goods.

High-tech rings

EBay is also a good place for authorities and burglary victims to look for purloined items. Silicon Valley companies now regularly scan online auctions looking for missing equipment. "We find something fairly often," said Mark Kerby, a private investigator for high-tech companies. "But whether we can make a case is something else."

A tipster recently led police to an eBay seller of flash memory cards stolen from Sunnyvale data storage company SanDisk by an employee. By the time police got involved, the thief's accomplice had sold nearly $210,000 worth of memory cards on eBay.

"EBay is a haven for fencing because it's easy to sell stolen property and remain kind of anonymous," said James Healey, a member of REACT, a Bay Area law enforcement group, who went undercover to bid on the memory cards.

Sometimes Internet-savvy theft victims instinctively check eBay when they find their property is missing.

After a golfer reported his stolen clubs were listed for auction on eBay, the Norman, Okla., police joined the bidding, bought the clubs and solved the mystery behind the region's rash of golf club thefts.

Brian Roberge, another victim, retrieved an old putter after the Norman bust but his set of Callaway Big Bertha clubs disappeared in cyberspace. "At $1,000, I can't afford to replace them," he said.

Unlike more common types of auction fraud, such as non-payment or failure to deliver goods, which are typically reported to authorities and auction sites, the fact that merchandise is stolen may be known only to the thief or an accomplice.

"There is not much the consumer can do. It's next to impossible to tell if something online is stolen," said Holly Anderson of the National Consumer League, which helps track auction fraud for federal authorities. "As we always say, if the deal is too good to be true, it usually is."

In Sharon Cooney's case, she meticulously researched the WinBook's fair market value and spent a few weeks checking the eBay seller's feedback ratings from other successful bidders. Reviews were glowing.

Her winning bid -- less than $200 below retail -- seemed fair. Three weeks later, when she called WinBook about adding memory, the company, which had noted the machine's serial number, informed her it was stolen.

"I guess I was the fool for trusting eBay," Cooney said.

After returning the machine to WinBook, she now hopes for restitution from the Maryland seller. Most of the seller's loot came from a FedEx accomplice who pilfered packages en route to delivery.

EBay executives called experiences like Cooney's "not at all usual."

Anti-fraud efforts

Law enforcement authorities say eBay runs sophisticated anti-fraud operations and not only helps nab criminals but also will assist prosecutors building legal cases by sending notices to other bidders who may have bought stolen goods.

Legally, the courts have ruled eBay does not fall under auction-house rules, exempting the company from liability over an item's authenticity, value and origin. EBay insurance -- which typically covers buyers for non-delivery -- does not cover purchases of stolen goods.

Rob Chesnut, a former prosecutor who is the company's deputy general counsel, said eBay is among the most public of arenas and for that reason is not a good place to unload stolen goods.

"If you're the bad guy, you want to be discreet and low key. You would use a pawnshop or flea market, not a public site like eBay," he said.

Authorities with subpoenas and search warrants have access to virtually all eBay trading records.

Yahoo director of auctions Brian Fitzgerald said fencing stolen goods is "not a significant problem."

Yet, online auctions can offer more anonymity than other fencing methods. Legitimate pawnbrokers, for example, must fingerprint people on whose behalf they sell items. Police routinely patrol flea markets.

EBay admits it can be stumped by bogus seller registration information and it is not uncommon for criminals to have several accounts to move large amounts of merchandise surreptitiously.

The Santa Clara County DA's office recently prosecuted a former San Francisco hotel worker who pleaded no contest to two counts of receiving stolen property obtained in heists of high-tech companies.

Mitchell Li had eight different eBay accounts. At Li's recent trial, Frank Berry showed the judge eBay records detailing how over 18 months Li conducted 1,682 auctions for goods worth more than $1 million.

Li's attorney, Bernard Bray, said his client ``unfortunately may have come into possession of some stolen goods, but he is not a thief.''

Recently, eBay has stepped up its anti-fraud activity. New initiatives include forming a separate trust and security department, which will be run by a former prosecutor with the U.S. attorney's office. It has also installed fraud detection software designed to search for patterns that might suggest illegal activity.

``Every day, there's $36 million in sales on eBay, and the great majority of purchases are legitimate,'' said company spokesman Kevin Pursglove. ``If anything makes the user suspect, do not hesitate to e-mail the seller and ask for background information.''
********************
MSNBC
Long-lost password discovered
Norwegian history database cracked with help from the Web
By Robert Lemos



June 10 - A Swedish game programmer won the race to discover the password to a Norwegian history museum's database, the museum's director said Monday. The password had been lost when the database's steward died without revealing it.


OTTAR GREPSTAD, director of the Ivar Aasen Center for Language and Culture, said in an interview that Joachim Eriksson, a programmer for Swedish game company Snowcode, sent the correct password just five hours after the museum's call for help. The center had posted the database file on its Web site, asking for help in opening it.
"He used one hour to solve everything," Grepstad said. "It is a story with a happy end."
Eriksson's e-mail, the first received by the center, not only had the correct password; it also included the unencrypted files of the database. Later submissions also had the correct password.
The database serves as a digital catalog to a collection of more than 11,000 books and manuscripts, and the password "ladepujd" turned out to be the backward spelling of the last name of the researcher who assembled the collection.
The center had publicly requested aid from security experts on the Web last week after its employees were unable to open the digital catalog, obtained from the family of Reidar Djupedal after his death in 1989. Djupedal was a professor and an expert on Ivar Aasen, an itinerant Norwegian researcher who, in 1850, established a new language for Norway that bridged all the country's dialects.
New Norwegian, or Nynorsk, is spoken regularly by about 20 percent of the country's population and is the main language in Western Norway, where nearly 25 percent of newspapers use it. The widely used Dano-Norwegian language, or Bokmål, a written language based on Danish, accounts for the other 80 percent, according to the center.
Nine years ago, an archivist transferred bibliographic information on 11,000 of Djupedal's 14,000 titles to a database created with DBase III and IV, but the archivist died before the collection and the catalog reached the center, taking the password with him and leaving the catalog inaccessible. Djupedal himself had died earlier.
"We have no known information from (the archivist) which can help us solve the problem," the center lamented on the Web site, calling for help from anyone who could break the encryption on the database or find the password.
E-mail messages from more than 100 people began flooding the center on Thursday afternoon after the organization's call for aid was picked up by the media. The online request attracted a lot of attention and, reportedly, even had some parapsychologists calling to offer aid.
Previously, the center had tried to get other Norwegian librarians to help, and when that failed, hired professional computer technicians. "We tried some expert help," said Grepstad, "but it turned out not to be so expert."
That's when the center hit upon the idea of using the Web. After posting the encrypted database on its site, the center had more than 400,000 hits.
It's unknown how Eriksson retrieved the password: by decrypting the database, by using a flaw in the database's security to obtain access to the data, or simply by guessing. Eriksson could not be immediately reached for comment Monday.
************************
Government Executive
Proposed agency will focus more attention on cybersecurity
By Bara Vaida, National Journal's Technology Daily


Richard Clarke, special adviser to the president on cyberspace security, said Monday that the proposed new Homeland Security Department would increase the government's focus on cybersecurity.


Clarke said that the new Cabinet-level department would house the FBI's National Infrastructure Protection Center, the Commerce Department's Critical Infrastructure Assurance Office, the General Services Administration's FedCIRC and the Defense Department's National Communications System, thereby increasing cooperation among the agencies.


"It will concentrate our focus and result in better cooperation," Clarke said at George Mason University's Networked Economy Summit.

He did not comment on whether the Office of Cyberspace Security, currently located within White House offices, would be moved to the Homeland Security Department.

Clarke also noted that the government continues to work on a strategy for protecting the nation's critical infrastructure but said the date for public release has slipped to late summer or mid-September. Originally, the plan was to be released in mid-summer, around the same time as the release of White House Homeland Security Director Tom Ridge's security plan. Ridge released a part of that plan last week when President Bush announced the plan for the new department.


The critical infrastructure plan will include input from the private sector, including the academic community. Clarke said he met last week in Seattle with 35 university officials and 150 faculty members to discuss their role in protecting critical infrastructure.



The meeting also included discussion of the president's cyber corps program, which provides scholarships for students to study cybersecurity. Clarke said the 35 universities have agreed to provide cybersecurity programs, and the 150 faculty members attending the Seattle meeting endured "boot camp" training in how to teach cybersecurity.


Clarke also outlined other actions the government is taking to boost cybersecurity. Because 85 percent to 95 percent of the Internet is operated by the private sector, Clarke's staff has been meeting with boards of corporations, insurance companies and auditors to underscore the importance of cybersecurity. In addition, his staff has been meeting with information technology customers to educate them on vulnerabilities in computer software and hardware.


"We are asking these customers if they know about the security flaws and asking them why they put up with that from the IT vendors," Clarke said.


He also said his staff is exploring whether a non-government entity could become an authority to certify security products, so companies would have better information on whether the products they buy properly protect their systems. "The biggest role the government can play is be a nudge" in getting the private sector to focus on security, he said.

In addition, to improve government security, Clarke said the president has asked for $5 billion in IT security products for government agencies in fiscal 2003, and "it looks good" for Congress to pass legislation that would authorize the spending.
********************
CNN
IBM reports storage breakthrough
'Millipede' similar to old punch cards


NEW YORK (AP) -- Researchers at IBM Corp. announced Tuesday the development of an ultra-dense storage technology that resembles the old computer punch cards -- except that the latest version shrinks the perforations to a molecular scale.

The technology, developed under the code name "millipede," was conceived by two scientists at IBM's Zurich research labs, who discussed the idea over beer after the company's weekly soccer games, said Peter Vettiger, the storage project's leader and one of those who conceived it.

The millipede lab prototypes can store as much as 20 times the data of the magnetic storage media used in today's computers, cramming as much as 25 million printed textbook pages of data on a surface the size of a postage stamp, the company said.

If IBM decides to manufacture millipede-based storage cards -- it has no current plans to do so -- the storage could begin replacing the current silicon-based flash memory cards in handheld computers and mobile phones by the end of 2005, Vettiger predicted.

"We see these devices in the mobile arena, in the handheld arena," Vettiger said. "There is strong desire for an increase in storage capacity, beyond what flash memory can store. There is a requirement for low power, for small size and for low cost."

A trillion bits of data
IBM said the millipede devices can store one terabit, or a trillion bits, of data per square inch. The data are stored in thin sheets of plastic polymer film as indentations just 10 nanometers, or ten-millionths of a millimeter, in diameter.


Unlike punch cards, the new devices are re-writeable, meaning they can be erased and refilled over and over. Vettiger said IBM's tests have erased and refilled them hundreds of thousands of times.

Although millipede has been described as a relative of the primitive punch card, Vettiger said the new memory technology is closer in design to the atomic force microscope, invented in 1986 by millipede co-designer Gerd Binnig, the Nobel Prize-winning co-inventor of the scanning-tunneling microscope.

Vettiger said the storage technology doesn't appear beset by the data density limits of flash memory chips, and could cram 10 to 15 gigabytes of data into a tiny format that would fit in a multifunctional wristwatch.
**************************
CNN
Flaw reported in IE browser
Microsoft says it is investigating the claim


REDMOND, Washington (AP) -- A security flaw in Microsoft's Internet Explorer browser could allow a hacker to take control of a remote computer if its user clicks a link to an outdated Internet protocol, a computer security firm says.

Oy Online Solutions Ltd. of Finland said it notified Microsoft Corp. of the security hole on May 20 but the software giant has yet to produce a software patch to fix the problem, the Toronto Star reported Tuesday.

A Microsoft spokesman who refused to be identified said Tuesday that the company is "moving forward on the investigation with all due speed" and will take the action that best serves its customers.

The problem concerns Gopher, an Internet protocol that predates the World Wide Web; its pages resemble Web pages but cannot carry audio or video content.

Although Gopher is considered an outdated format for Internet content, it is still supported by Internet Explorer and most other browsers.

According to Oy Online, a hacker could take over a user's computer simply by having the user click on a link to a "hostile Gopher site." That one click would install and run any program the hacker chose on the victim's computer, and the victim might never know.

"The program could, for example, delete information from the computer or collect information and send it out from the computer," Oy Online said in a release. "(It) could also install a so-called backdoor (program) that would enable the hostile attacker to access the computer later."

Various versions vulnerable
All versions of Internet Explorer are believed to be vulnerable, the Star reported.


Refusing to confirm the security flaw, the Microsoft spokesman said the company "feel(s) strongly that speculating on the issue while the investigation is in progress would be irresponsible and counterproductive to our goal of protecting our customers' information."

And the spokesman added, "Responsible security researchers work with the vendor of a suspected vulnerability issue to ensure that countermeasures are developed before the issue is made public and customers are needlessly put at risk."

After being embarrassed on an almost regular basis by security flaws in its products -- including a debilitating problem found in its latest Windows XP operating system just days after its release -- Microsoft began a companywide training program on security issues earlier this year.

In January, Microsoft Chairman Bill Gates instructed employees to make software security a top priority.
******************
Euromedia.net
Webcasters win one battle in war over internet radio


As the webcasting industry waits for the US Librarian of Congress's decision on internet radio royalty rates, there are some signs that the Recording Industry Association of America (RIAA) is looking to significantly change its original proposal to the benefit of webcasters.

One of the main issues still outstanding is what information webcasters have to give about the music they play and how often they have to report it.

In early February, the US Copyright Office announced a set of reporting requirements for webcasters that use copyrighted music; the rules favored the music industry and threatened to make compliance very difficult for webcasters.

The reporting requirements were called "Notice and Recordkeeping for Use of Sound Recordings Under Statutory License."

They required webcasters to provide 18 pieces of information to copyright holders for every song streamed, including such obscure information as the "numeric designation of the place of the sound recording within the program," "the ISRC code of the recording," and the UPC code and catalog number of the retail album.

The Copyright Office requirements also included an RIAA request that many thought was an invasion of privacy of the user.

Webcasters had to supply at least seven additional pieces of information, including the date and time the user logged in and out of the stream, the country and time zone of the listener, and a "unique user identifier." This information had to be collected for every track, for every listener.

Tracking and reporting this information would have put a tremendous burden on webcasters, requiring significant cost, effort and time.

Webcasters claimed that they do not have access to much of that type of information and many simply would not be able to comply. It was considered to be another nail in the webcasters' coffin.

But in what can definitely be thought of as a step in the right direction for webcasters, the Librarian of Congress has issued a statement saying that these requirements will be significantly reduced, from 18 data points to five, when they are officially announced along with royalty rates on June 20.

All five required data points are readily available to the webcaster: the name of the artist, the title of the recording, the name of the album (if available), the recording's label (if available), and the number of times the recording is played during the reporting period.

Also of great benefit to the webcaster is how often the information is required. The announcement indicated that the information would be required on a sampling basis, that is, "for a certain period of time during each calendar quarter." Copyright holders (such as record labels and artists) had asked for "census" reporting (every song, every listener).

These are called "interim" reporting requirements. A vague time period of "several months" was given for the announcement of the "final requirements," which could include more comprehensive reporting.
************************
Sydney Morning Herald
New domain name regime to go live on July 1


auDA today announced that the new domain name regime for com.au, net.au, asn.au, org.au and id.au will go live on July 1.

Final testing of the new registry, built by AusRegistry Pty Ltd, was completed on 7 June 2002. Two independent reports (one from international registry Liberty RMS) have provided auDA with confirmation that the registry has been completed pursuant to the technical specification in the Registry Licence Agreement.

The go-live date has been set to allow enough time for an orderly transition of data from the existing registries and for the provisionally accredited registrars to complete auDA's Registry-Registrar interface test.

After a series of transition tests, actual data transition will take place on the weekend of June 29 and 30. During this period the domain name space will continue to operate normally but it will not be possible to register new domain names or re-delegate domain name servers.
******************
Sydney Morning Herald
Boom to bust for domain name claims
By Nicole Manktelow


It's a population in decline. More domain names are expiring than are being registered or renewed. Falling, too, is the perceived value of a prime website address.

Despite efforts to reinvigorate the market, the big shrink is a growing concern for the world's registration companies.

"The problem is decreased revenue to the registers," META Group analyst John Brand says.

"As people recognise the value of a domain name is decreasing every day, the value of the business is decreasing every day. So the question is where does that leave them?

"People will work out (that) the ability to register a domain name is a basic service."


Tom Valenta, investor relations adviser for Australian registrar Melbourne IT, says the number of domain names has been shrinking for six months.


"In some areas, it has shrunk from the land rush that occurred during the year 2000. Certainly since the September quarter, there has been a slowing down."

The number of registered domain names reportedly doubled in 1999 and tripled in 2000, but the domain name industry journal State of the Domain reported that the tally of .com, .net and .org names shrank by 378,000 in March alone. The publication estimates that in the previous six months, these domains lost about 2.5 million names, or 12.3 per cent.

Valenta says new domain variants such as .biz, .info and .name could help. "They are growing at an encouraging pace. The new domains are helping - .info is approaching nearly three-quarters of a million registrations, but .name is still very small at about 68,000."

However, META Group's Brand disagrees, saying the appeal of the new domain names is limited: "We advised our client base not to enter into that at all," he says. "The value of a brand can be diluted by the number of sites around it. What you need is something simple that stands out."

In the glory days of the domain name game, speculators grabbed domains bearing brands and key words to auction to the highest bidder. Now, experts believe a "prime" domain name is becoming less important.

"As search engines and location services have improved, the reliability of search has lessened the relevance of a domain name," Brand says. "All that's really necessary is one domain with a strong brand - one strong entry point."

Companies that once bought many variations of a name are now streamlining their portfolios. Scalpers are letting some of their stock expire, and for others, what seemed like a good idea at the time is now hardly worth the renewal fee.

Melbourne IT gets about 58 per cent of its revenue from outside Australia and is the fourth-largest registrar of domains in the world, yet it has suffered less than its American counterparts. Valenta says Melbourne IT has fewer "totally speculative" names because of stricter registration policies.

Like others in the industry, Melbourne IT is hopeful that the declining number of names is a market correction, the natural death of opportunistic names registered in the boom.

Brand says speculators were the first group to start shrinking.

"A lot of them bought huge numbers of domains and are now letting them lapse," he says. "I know an Australian company that was offered a variation of their brand name in a .com. The initial asking price was $5 million. They took it as a joke. The next offer was $80,000. Eventually, the name was allowed to lapse and the company bought it for $250."

But while some generic names remain prime property, few if any can hope to be sold for record-breaking prices.

"Anyone paying anything more than $50,000 is paying too much," Brand says.

Many companies have stopped registering variations of their brands to thwart imposters or exert some control over their online publicity, he says. "They have realised that people still say what they want to. Just having the domain name is no protection," he says.

Brand says that names bought for personal ego or reasons of poor strategy are also likely to be on the expiry pile.

"A lot of domains were bought for new products that were supposed to revolutionise the world - and didn't."
********************
New Zealand Herald
Govt gives free software a run
By RICHARD WOOD
The e-government unit of the State Services Commission has surprised the Open Source community by implementing Open Source directory software in one of its projects.


Directory software holds the basic information needed to verify a computer user's identity.

Open Source software is developed co-operatively by individuals and businesses around the world and provided free to use, modify and redistribute.

Open Source will also get a look-in against competing vendors' products when the e-government unit considers which software to use for authenticating business and public access to Government services online.

NZ Open Source Society spokesman Peter Harrison said Open Source usually missed out on Government deals because the software did not have an associated vendor to respond to requests for information.

"I'm impressed that some of these Government departments are looking beyond vendors," said Harrison. "The Government purchasing guidelines do not specify looking at Open Source."

The e-government unit supports the email side of the Government's Secure Electronic Environment (SEE) system for interdepartmental communication.

Program architect Brendan Kelly said the SEE Mail project required a directory so agencies could find who was using the secure email service.

OpenLDAP, the software being used by the commission, has been tested running on Open Source operating system Red Hat 7.1 Linux.

An e-government online report said the unit was "very impressed with the stability, consistency and reliability of the product", but OpenLDAP is mentioned only as a stopgap measure.

The report said an Open Source directory presented a "viable starting point deferring the need to select a particular directory vendor, allowing valuable experience to be gained in the implementation and usage of a directory for minimal outlay."

Kelly said OpenLDAP was a viable option for further use within the Government.

Unit head Brendan Boyle said the e-government unit considered Open Source software every time it needed a solution.

He said looking at Open Source was not just about cost.

"Cost would be an implication obviously, but the bottom line is that whatever we do has to meet the need."

Kelly said the use of OpenLDAP was not intended to replace any existing Government department's directories and the unit had consulted major vendors to ensure the system was inter-operable.

"In developing the policy we worked with Microsoft, Novell and Solnet to cover the major flavours of types of directories," he said.

"All three vendors were confident it would not cause them problems." "

Boyle said the expertise gained from using OpenLDAP would also be useful for the wider authentication system that would be required for government-to-business and government-to-citizen systems.

That is a deal all the major directory vendors will be wanting to win.

Peter Revell, country manager of one such vendor, Novell, said he was not familiar with OpenLDAP's capabilities but that it had not been a serious challenger.

"In the debates and strategic thinking with [our] customers regarding meta directory strategies and identity management, OpenLDAP is not on the radar screen at all."

The Inland Revenue Department has Novell software providing authentication for 16,000 employers. Last year there was talk of extending that deal to the general taxpayer.

Boyle said that although having one directory for public access would be the most efficient approach, any arrangement was more likely to involve a number of directories because of privacy considerations.

S.E.E. Directory: Paper 1 - Viability of open source options in the directory space, see http://www.e-government.govt.nz/docs/see-directory-paper-1/
*******************
New Zealand Herald
Russians create 3D power map
By RICHARD WOOD


Transpower is one year into a long-term project to survey its power lines nationwide and turn the results into a three-dimensional computer model.

A Russian company is using a laser system that measures the location of the lines in three dimensions from a helicopter.

The system measures the reflection off the wire 50,000 times a second and combines that data with information from a highly sensitive gyro-stabilisation and satellite-positioning package in the helicopter.

The result, once processed in Russia, is a computer-aided design file modelling the power lines in 3D.

Transpower uses the information to identify the height of wires above ground. The more power sent down a line, the more it heats and sags.

But for public safety reasons there is a limit to how far a line is allowed to sag.

Transpower's general manager of service delivery, Kieran Devine, said the company used sensing equipment to measure wind speed, air temperature and solar radiation at its substations and other sites as required.

"Out of that we do a calculation which tells us what the maximum sag on the line will be and that tells us either the capacity we've got left in it or that we are over capacity," he said.

"As we get more experienced, we may put measuring gear at specific sites midway on the line.

"If we have a particularly tight line with a known constriction, we can put the measuring gear right at the point and therefore run it a bit closer to the limit."

The whole project could take three or four years to complete as the company works through the network.

The first benefit is that Transpower will be able to identify where it is cost-effective to make physical changes such as tightening lines to lift them higher.

The ultimate goal is a "variable line rating" system, whereby the rated capacity that each line can handle can be adjusted frequently depending on environmental factors.

At present, lines simply have a summer and winter rating. Some critical lines could ultimately have a rating that changes every 10 minutes, maximising their throughput when required.

This system will use high-tech, point-to-point radio systems which are being produced by Wellington firm 4RF.
*********************
Sydney Morning Herald
Opening the Open Source debate - II
By David F. Skoll


The Alexis de Tocqueville Institution (AdTI) has finally published its white paper entitled "Opening the Open Source Debate". My earlier comments were based on media reports and e-mail correspondence with the paper's author. This document was written after I read the actual white paper. (The original link seems not to work; I managed to grab a copy of the paper before AdTI pulled it. This link may work.)


The AdTI's very weak and poorly-researched paper opens no debate. It simply confirms that Microsoft paid AdTI to come up with something - anything - to stem the growing adoption of open-source (especially GPL'd) software by business and government.


Let's take a look at the paper in detail.

I. In the Beginning

Section I, "In the Beginning", gives an overview of proprietary vs. free software. It's reasonably accurate, although the author is given to rather ludicrous depictions of source code as a "secret formula" and a "map to a buried treasure."

II. GPL Open Source: The Gift That Keeps Taking

Section II is where Microsoft vents its anger. Take a look at this gem:

The GPL is one of the most uniquely restrictive product agreements in the technology industry.

Why does Microsoft... excuse me, the AdTI... say that? They say that because:

The GPL requires that if its source code is used in any type of software product (commercial or non-commercial) for any reason, then the entire new product (also known as the derivative) becomes subject to terms of the GPL open source agreement.

This is not quite true; if you do not distribute your derived product, then you do not need to distribute the source code. But for the most part, the statement is accurate.

But so what? Suppose you derive a product from Microsoft Windows or some other proprietary code. Then you are breaking all kinds of license agreements. Furthermore, proprietary vendors would demand and get the rights to your derived product, leaving you with nothing.

The GPL is no more restrictive than the most liberal of proprietary licenses, and a good deal less restrictive than most. So Microsoft's... excuse me, the AdTI's... complaints are groundless.

Another quote:

David Wheeler, a Washington publisher and expert on open source and proprietary source, comments: "without licensing the source code in a multi-license format (referring to other more permissive licenses), it is impossible for GPL to work for a proprietary business model."

Perhaps the AdTI misses the point. GPL advocates do not care if GPL'd software can be made to work in a proprietary business model. It's not our problem. There's no God-given right for proprietary software vendors to make money; they have to compete. And if the rules of the marketplace suddenly change and make it difficult for them, well - tough. Adapt or die. Don't moan.

III. The Myth of a Public Software Community

Section III attempts to debunk the "myth" of a public software community.

The AdTI hints that open-source advocates abandon their principles when they smell money:

Widespread support for GPL open source lies in the IT community's frustration with competitive, closed proprietary software. But in fact, it is quite common that programmers experiment with open source until they see an opportunity to capitalize on an idea, then embrace proprietary standards. One could joke that open source has been a bridesmaid but never a bride. The story of the web browser is an example of this reality.

AdTI uses the story of Netscape "killing" the open-source Mosaic. Well, Mosaic was never GPL'd. If it had been, Netscape would have been unable to kill it. Furthermore, AdTI says of Mosaic:

Through a commercial partner, Spyglass, NCSA began widely licensing Mosaic to computer companies including IBM, DEC, AT&T, and NEC.

Conspicuously absent from AdTI's list is another licensee: Microsoft. Yes, Spyglass's browser formed the basis for Internet Explorer. And revealed here is Microsoft's reason to fear the GPL: It cannot make use of the work of thousands of dedicated programmers for free, locking the work up in a proprietary product. It did that with early versions of its TCP/IP stack, derived from the Berkeley stack. But as more free software is GPL'd, Microsoft's cherry-picking opportunities diminish. Isn't it sad?

The AdTI never quite gets around to saying why the open-source community is a "myth". Apparently, the hundreds of collaborators who gave the world the Linux kernel are mythical. Perhaps the outstanding KDE desktop environment was written by unicorns. And one supposes that GNOME, another outstanding desktop environment, was produced by, well, gnomes. Apache - it's a myth. PHP - doesn't exist. Mozilla - pshaw.

Even in my own modest software development, I've had contributions from dozens of people around the world to my software packages. I've had suggestions, fixes, enhancements and pieces of wisdom donated to me which would never have happened in a proprietary development environment.

IV. The Government and the GPL

This is where politicking gets into high gear.

However, the use of the GPL has the potential to radically alter a very successful model for partnership, particularly when most large commercial entities do not readily embrace the GPL.

Once again, the white paper is worried about "large commercial entities." Well, some large commercial entities like HP/Compaq, IBM, Dell and Sun are quite willing to use, produce and/or distribute GPL'd software. To those large commercial entities who wish to stop GPL'd software, I say: "Tough. Adapt or die."

Needless to say, the government could not depend on patches for software glitches to wander in from the public. Likewise, the government could only use open source code that it could independently service in case of an emergency. Agencies without extensive staff to maintain its internal operations cannot afford to use hapless and untested software without accountability, warranties or liability.

This is a complete red herring. Patches don't "wander in" from the public for open-source products. Rather, they come straight from the authors, or sometimes from distributors such as Red Hat. Furthermore, they tend to come in with a lot more alacrity than fixes from commercial vendors.

With open-source, the government at least has an option to be able to "independently service" the software in case of emergency. With proprietary software, the government does not even have this choice. Therefore, the AdTI's objections on this ground are spurious.

Another consideration for the U.S. government is that all source code developed under the GPL could have mirrored availability to the public. This poses unlimited security issues.

AdTI loves this refrain, but has yet to prove it. In my other article, I debunked the myth that source code availability necessarily introduces security issues, and demonstrated that in fact, it can often enhance security. I was interviewed by AdTI for my opinions on the matter; they neglected to include my comments in the paper.

For example, if the Federal Aviation Agency were to develop an application (derived from open source) which controlled 747 flight patterns, a number of issues easily become national security questions such as: Would it be prudent for the FAA to use software that thousands of unknown programmers have intimate knowledge of for something this critical? Could the FAA take the chance that these unknown programmers have not shared the source code accidentally with the wrong parties? Would the FAA's decision to use software in the public domain invite computer hackers more readily than proprietary products?

Again, a ludicrous example. No-one simply sits down and "develops" such an application by starting with free software. Even if the FAA did develop an open-source flight-control application, AdTI has not demonstrated at all that it would have significantly different security issues from a closed-source one. Sure, AdTI asks a bunch of rhetorical questions. But that's not how one conducts a logical argument. So let's answer the rhetorical questions with some of our own:

Would it be prudent for the FAA to use software that thousands of unknown programmers have intimate knowledge of for something this critical?

Is it prudent for any federal agency to use Microsoft software, given that it is a matter of public record that Russian hackers illegally broke into Microsoft's network and had access to source code? Is it prudent for any federal agency to use software which is not freely-available for peer review? Is it prudent for any federal agency to take the word of a proprietary vendor that its software is secure, given that the vendor is attempting to make a sale?

Could the FAA take the chance that these unknown programmers have not shared the source code accidentally with the wrong parties?

Will the FAA ban the use of Microsoft software, given that it is a certainty that Microsoft source code has been shared "accidentally with the wrong parties"?

Would the FAA's decision to use software in the public domain invite computer hackers more readily than proprietary products?

Will the AdTI comment on why proprietary Web servers seem to be cracked far more often than open-source ones, even though they have smaller market share?


Reverse Engineering

Experts differ on whether the primary focus for security should be source code or binary code. Andrew Sibre, a programmer with over twenty years of experience, insists, "Having a license for binaries only gives you a black box: you don't know what it's doing, or how, unless you want to go insane trying to reverse-engineer it with a debugger (illegal under the terms of most licenses)." Having the source lets you see what it's doing, how it does it, and permits you to modify it to meet your particular requirements (including security-related ones). To this extent, government officials should be concerned that the threat may not just be an adversary cracking their system, but inadvertently educating adversaries about their security systems. Sibre continues, "Depending on code without the source is quite similar to depending on a complex mechanical or electronic system without the benefit of shop and parts manuals."

Naturally, having access to source code eases reverse-engineering. However, the vast majority of security exploits are found without access to the source code. As I wrote to Ken Brown, the author of the report:


The entire premise of computer security and encryption is as follows:

A security system must be resistant to attack *even if* the attacker has all the details about how it works. I refer you to:

"Applied Cryptography", Bruce Schneier, John Wiley and Sons, Inc, page 3:
"All of the security in these algorithms is based on the key (or keys); none is based in the details of the algorithm. This means that the algorithm can be published and analyzed. Products using the algorithm can be mass-produced. It doesn't matter if an eavesdropper knows your algorithm; if she doesn't know your particular key, she can't read your messages."


I refer you also to:

"Practical UNIX and Internet Security", Simson Garfinkel and Gene Spafford, O'Reilly and Associates, pages 40-45:
"... This is especially true if you should find yourself basing your security on the fact that something technical is unknown to your attackers. This concept can even hurt your security."


I refer you to an Internet draft on security through obscurity:
http://www.ietf.org/internet-drafts/draft-ymbk-obscurity-00.txt

A few more links on why security through obscurity does not work:
http://www.zdnet.com.au/developer/standards/story/0,2000011499,20265619,00.htm

http://www.treachery.net/~jdyson/toorcon2001/

http://www.counterpane.com/crypto-gram-0205.html

http://online.securityfocus.com/columnists/80

http://www.vnunet.com/Analysis/1126488

Is Reverse-Engineering That Hard?

Before I started Roaring Penguin Software Inc., I worked at Chipworks, a company which does reverse-engineering for a living. From first-hand experience, I know that hardware and software security can be broken more easily than most vendors believe, and much more cheaply, too.

Back-Doors

Ken Brown raises the old back-door bogeyman:

Another security concern is that the primary distribution channel for GPL open source is the Internet. As opposed to proprietary vendors, open source is freely downloaded. However, software in the public domain could contain a critical problem, a backdoor or worse, a dangerous virus.

The following material is taken straight from my other article, where I already covered the back-door issue:
In fact, there have been some trojans placed in open-source software. They have usually been discovered and neutralized very quickly. By contrast, closed-source products have a sad history of spyware, "Easter eggs", and questionable material, placed by people who have (presumably) been "screened." In fact, one of Microsoft's own security updates was infected with a virus, something which (to my knowledge) has never happened in the open-source world.


An interesting back-door was one in Borland's closed-source Interbase product. This back-door lay undetected for years, but was revealed within weeks of the product being open-sourced.

And another interesting little "easter egg" is on the AdTI's very own Web site.

Questionable material in Microsoft software may have helped spur a Peruvian bill to promote free software in government. The author of the bill says that open-source software provides a better guarantee of "security of the State and citizens" than proprietary software, an analysis which is 180 degrees out of phase with the AdTI Study.

The real "victims" of the GPL

The government's productive alliance with private enterprise is also relevant particularly when its decision to use GPL source code would inherently turn away many of its traditional partners. Security, as well as other impracticalities make GPL open source very unattractive to companies concerned about intellectual property rights. In effect, the government's use of GPL source code could inevitably shut out the intellectual property based sector.

The Government must choose software to maximize national security and minimize government expenditure. It owes absolutely nothing to the "IP-based sector" or any other corporation. What was it I said before? Oh, yes: "Tough. Adapt or die."

This has a number of ramifications. Immediately, it would limit the number of qualified vendors to choose from to deliver products.


Tough. Adapt or die.

The GPL's wording also prevents the equal use of software by members of the IP community and the GPL open source community.

This is a lie. If the "IP community" (whatever that is) respects the terms and conditions of the GPL, it's as free as anyone else to use and distribute GPL'd software. If it doesn't like the terms of the GPL, that's the "IP community's" problem, not the GPL's problem.

A worse consideration is that use of GPL could inadvertently create legal problems. IP community members could argue that the government's choice of open source is restrictive and excludes taxpaying firms from taxpayer-funded projects. Adverse impact would include a discontinued flow of technology transfer from government-funded research to the technology sector. Without value, it becomes highly likely that government funding for research would slow as well.

Here, AdTI is delivering a veiled threat on behalf of Microsoft. First of all, if "IP community members" could argue that, they already would have. They have not made the argument because they know it is specious. In fact, there's a very good argument for requiring the fruits of government-funded research to be GPL'd so that all citizens can benefit.


Furthermore, the "IP community members" have benefited from government research as much as (or more than) government has benefited from private research. So to pull out of government partnerships out of pique over software licensing would only hurt proprietary vendors and no-one else.

V. Intellectual Property Left

This is a rewording of "Free Software is Communism" and merits about the same amount of serious attention.

U.S. intellectual property (IP) statutes have been a beacon for inventors around the world. The U.S. model for motivating, compensating and protecting innovators has been successful for almost 200 years. GPL source code directly competes with the intellectual property restrictions, thus it is vital to analyze its impact.

The GPL does not in any way "compete" with U.S. copyright law. It uses U.S. copyright law in a perfectly legitimate and reasonable way.

There are two groups of programmers that contribute to the open source community. The first group consists of professionally hired programmers by day, who freely contribute code. The second group consists of original equipment manufacturers (OEMs) that are hiring open source programmers for their products. However, open source principally perpetuates itself because there is an avid pool of experts and enthusiasts willing to spend their spare time to provide fixes and modifications to open-source software. This volunteer system works well as an academic model, but not as a business one.

Who cares about business models? We have Linux, Apache, Mozilla, Gnome, KDE, Perl, Python, PHP, FreeBSD, OpenBSD, NetBSD, and so on in spite of the supposed lack of a business model. What we see here is more whining from proprietary vendors about how free software is hurting their business model. Let's hear the refrain: "Tough. Adapt or die."

As mentioned earlier, open source code is not guaranteed nor does it come with a warranty.

Neither does most proprietary software, so this is a red herring. If you want a warranty, most open-source vendors will be happy to provide one if you pay for it.

Open source products are often distributed without manuals, instructions or technical information. While a commercial developer is obligated to produce manuals, diagrams and information detailing the functionality of their products, open source programmers are not. In addition, open source developers cannot be expected to create software manuals with the vigor of private firms that are obligated to produce them. Producing technical specifications (in soft or hard copy format) is time-intensive and expensive. But this is not just a customer service issue.

Some open-source software comes with poor documentation, just like some proprietary software. Other free software comes with excellent documentation. It's a matter of customer choice: Choose software that has what you need.

All of my free software products come with complete manual pages. Most serious developers do not consider software finished until the manuals are finished.

Innumerable questions surround the distribution of technical information in the copyleft environment, particularly because the Free Software Foundation has a copyleft license for its documentation as well. Issues include: Who should have the right to alter software manuals? Who is the final editor or is there one? How should changes be regulated? Are manuals copyright protected documents? What is the process for making changes? What body regulates these changes? How can organizations guarantee that information in manuals is always accurate?

More rhetorical questions. With proprietary software, if the manuals are inaccurate, you're out of luck. With free software, you at least have a chance to correct them.


Again, we see the unease of the proprietary vendors who want bodies to "regulate" changes. They are unable to wrap their minds around the new reality of free software. Rather than changing their ways, they dig in their heels. They may need another reminder: Tough. Adapt or die.

Today, software impacts a firm's financial health in an intimate fashion. It becomes unrealistic for a firm to depend too much on the trust of an anonymous community that does not have anything at stake financially to keep important technical documents current.

On the contrary, it is imperative that businesses rely solely on free software for access to critical information. Only in this way can they guarantee access to their data, and not be held hostage by proprietary file formats and proprietary vendors. To quote Dr. Edgar David Villanueva Nunez, a Peruvian legislator:
To guarantee the free access of citizens to public information, it is indispensable that the encoding of data is not tied to a single provider. The use of standard and open formats gives a guarantee of this free access, if necessary through the creation of compatible free software.


To guarantee the permanence of public data, it is necessary that the usability and maintenance of the software does not depend on the goodwill of the suppliers, or on the monopoly conditions imposed by them. For this reason the State needs systems the development of which can be guaranteed due to the availability of the source code.

To guarantee national security or the security of the State, it is indispensable to be able to rely on systems without elements which allow control from a distance or the undesired transmission of information to third parties. Systems with source code freely accessible to the public are required to allow their inspection by the State itself, by the citizens, and by a large number of independent experts throughout the world. Our proposal brings further security, since the knowledge of the source code will eliminate the growing number of programs with *spy code*.

In the same way, our proposal strengthens the security of the citizens, both in their role as legitimate owners of information managed by the state, and in their role as consumers. In this second case, by allowing the growth of a widespread availability of free software not containing *spy code* able to put at risk privacy and individual freedoms.


In fact, Villanueva's eloquent and well-written letter handily demolishes most of AdTI's premises and conclusions; it's well worth a read.


More on Reverse Engineering

The reliance on reverse engineering is probably one of the biggest conflicts between the IP and the GPL open source community. To keep GPL products relevant and up to date, GPL enthusiasts must perpetually reverse engineer intellectual property.

Reverse-engineering is required only if hardware manufacturers keep the details of their software/hardware interfaces secret. The vast majority of hardware manufacturers do not keep them secret. Some which do keep them secret provide (binary-only) drivers for free software systems. Reverse-engineering is necessary only for the small minority of hardware devices which are secret.

Reverse engineering has a number of implications. It harbors very close to IP infringement because and has staggering economic implications.

Reverse-engineering is perfectly legal. In fact, the European Union has a law guaranteeing the legality of reverse-engineering for the purpose of creating compatible software or devices. AdTI implies that reverse-engineering is "close to IP infringement", but they never say why (and their sentence doesn't even parse.)

If software is freely re-engineered, it will inevitably impact the value of software on the market. If the price of software is adversely impacted, salaries and inevitably employment of software programmers would be negatively affected as well.


This is correct. If software is freely re-engineered, it destroys monopolies and brings back a sense of free market to the industry. Yes, software prices go down. And yes, consumers benefit.

The whole paragraph is simply a thinly-hidden Microsoft lament about the success of products like Samba which enable companies to run Microsoft-compatible file sharing without exorbitant Microsoft licensing fees.

VI. Is the GPL Cost-Beneficial?

This is a restatement of the tired old "TCO" straw-man.

Discussing the economic implications of open source, Andre Carter, President of Irimi Corporation, a technology consulting firm in Washington, comments, "The question of open source code is about whether the software developer wants to make available to the world the blueprint of what they built or simply the benefits of what they built. The notion of open source software has nothing to do with free software. The purchase price of computer software is only a fraction of the total cost of ownership. So even if the price tag reads 'free,' it can end up being more expensive than software you buy. This is especially true for the typical consumer. If it requires technical know-how to operate, doesn't offer built-in support, and demands constant attention, it won't feel free for very long."

Lots of "ifs" and weasel words in there. If it requires technical know-how to operate, etc., etc. Nowhere does Carter say that free software does in fact require any more technical know-how than proprietary software. Furthermore, proprietary software often has hidden costs which can come back later to haunt you.

The success of an A-Z open source environment would expectedly impact the software sector as a viable entity. If software is freely available, but PCs, servers and hardware maintain their value, we can only predict that the value of software companies will plummet. Hardware will come with more and more free software. Second, we can only expect that the revenues and value of the software sector will transfer to the hardware sector. Although the software sector has seen growth almost every year, it is questionable whether the GPL model will enable the software industry to continue its exceptional growth particularly when the growth in the software sector is tied to proprietary products, something the GPL is anxious to eliminate.

In the 1800s, blacksmithing was a pretty good profession. In the 1960s and 1970s, 8-track tapes did a pretty good business. The fact is that the blacksmith industry and the 8-track tape industry failed to heed the iron rule of the market: Adapt or die. If free software means the death of proprietary software vendors, it will be on the heads of those vendors who fail to adapt.

Businesses must be concerned about the perception of the GPL. For example, experts assess the value of intellectual property when completing valuations of firms. Because GPL open source literally erases the proprietary and trade secret value of software, it can be expected that firms concerned about valuations will be very concerned about using GPL open source.

This is only of concern to firms producing software. The vast majority of firms consume software, and for them, in-house software production is a cost, not a revenue source. For the vast majority of firms, free software will save them lots of money. For those few firms planning on building a business model around proprietary software, I offer my old refrain: Adapt or die. What's good for proprietary software vendors is not necessarily good for the citizen.

There are all types of consumers with ranges of needs and abilities. The guys in the lab at MIT don't need install wizards, plug and play drivers, voice based technical support and big picture manuals as part of their software. However, the elderly couple e-mailing their grandkids or the mother of two managing accounts on a PC in the kitchen does.

Carter clearly has a stereotyped view of consumers. My elderly parents, who enjoy e-mailing their grandkids, use only free software. They are quite happy to use Linux and Netscape. Furthermore, the choice of free software eases my support burden: If my parents need help, I can SSH into their machine and fix it remotely. With all of Microsoft's "wizards" and other gimmicks, they still do not provide a convenient means for remote administration on their consumer-level systems.


People believe free software is hard to use because they've never used it. Just as the AdTI showed that people who've actually worked with MCSEs have a higher opinion of them than people who haven't, people who've actually bothered to use free software have a higher opinion of it than people who haven't.

VII. GPL Open Source and the Courts

Once GPL code is combined with another type of source code, the entire product is GPL. Subsequently, this change could occur deliberately, but it could also occur accidentally. There are unlimited scenarios for accidents to occur, the license could be lost in the source code's distribution, or maybe unreadable due to a glitch in its electronic distribution. Another potentially litigious issue is whether the use of GPL tools used to manipulate code subject software to the GPL. Theoretically, a GPL tool could subject new software to GPL restrictions. This too will have to be interpreted by a judge. Regardless, unknowing users of GPL might have one intention for use of the license and find out later that it inadvertently infringed upon copyright protected work. Legal questions relevant to such an event intersect the legal arenas of intellectual property rights, contract law and liability.

AdTI is very good at offering up red herrings. Let's suppose you "accidentally" included part of Microsoft Windows in a product. Do you suppose Microsoft would be easier on you than copyright holders of a GPL'd product?

The fact is that any software license has terms and conditions which must be obeyed. The GPL is no different; if you do not like its terms, don't use GPL'd software. Microsoft's agenda is transparent here.

The proprietary software industry entreats you to diligently track licenses, and offers harsh retribution against those who violate their licenses. Most GPL violations are settled amicably, and those which result from an accident are usually settled merely by removing the offending code from distribution.

The rest of Section VII is simply speculation and not even worth commenting on.

VIII. Conclusion

Open source as a development model is helpful to the software industry. For example, software distributed under the BSD license is very popular. The BSD license (see Appendix 9) enables companies, independent developers and the academic community to fluidly exchange software source code.

English translation: The BSD license is good because it allows corporations to benefit from other people's work without offering them any compensation, and without having to allow third parties to benefit from derived work.

The GPL's resistance to the commonplace exchange of open source and proprietary code has the potential to negatively impact the research and development budgets of companies.
English translation: The GPL doesn't let corporations benefit for free from others' work.
The GPL has many risks, but the greatest is its threat to the cooperation between different parties who collaborate and create new technologies. Today, government, commercial enterprise, academicians, etc. have a system within which to converge. Conversely, the GPL represents divergence, proposing to remove the current infrastructure of intellectual property rights, enforceable protection and economic incentive.
English translation: The GPL threatens Microsoft's business model. You know my response by now: Tough. Adapt or die.
While GPL advocates are quite active in their promotion of copyleft, few would disagree that its widespread adoption would present a radical change to an industry sector responsible for almost 350 billion dollars in sales annually worldwide (see Appendix 10).
Few would disagree that the automobile all but wiped out blacksmithing as a profession. Few could argue that cassettes didn't decimate the 8-track market. Few would be surprised at my response: Tough. Adapt or die.


AdTI's Numbered Points and my Counterpoints

1- Engineering software has become considerably more complicated and rigorous. It is not unusual for software to include millions of lines of source code. If the incentive to develop software is changed, we can expect the quality and efficiency of software to change as well.
Yes, with luck, we'd expect the quality to improve. The security records of systems like OpenBSD, Linux, and FreeBSD are vastly superior to that of Windows. While this does not prove cause and effect, the empirical evidence suggests that open-source software is more reliable and of higher quality than most commercial-grade proprietary software.
2- There remain considerable differences within the GPL open source community. It is questionable whether these groups will continue to be proponents of the GPL in its current form or will opt for changes in the immediate future.
Even if true, this point is irrelevant. Once software has been licensed under the GPL, the license cannot be retracted. Your rights cannot be withdrawn retroactively (unless you violate the license), unlike some proprietary software licenses.
3- Open source has successfully been used in proprietary software. In addition, academic and government projects have been successful particularly because of commercial interest. Private enterprise offers unique efficiencies for the success of government funded research.
Simply another attack on the GPL. Nothing worth reading; let's move on.
4- Open source GPL use by government agencies could easily become a national security concern. Government use of software in the public domain is exceptionally risky.
A bold assertion, and totally unproven; indeed, it is contradicted by the empirical evidence. Also, the NSA seems quite comfortable with the security of GPL'd software.
5- Reverse engineering, perpetuated by GPL proponents, threatens not only the owners of intellectual property, but also the software industry itself.
This is an out-and-out lie. Reverse-engineering is critical for the continuation of a healthy software industry. Without legitimate reverse-engineering, there would be no market forces to oppose the development and maintenance of monopolies, and the software market would become even more unfair than it is today.


Attempts to ban reverse-engineering are simply money-grabs by greedy monopolies who wish to hang on to their power.

6- Use of GPL open source creates a number of economic concerns for firms. For example, the valuation of a software company could be significantly affected if it uses source code licensed under the GPL for the development of its products.
If that is of concern (and it is not for the vast majority of corporations), then the corporation is perfectly free not to use GPL'd software.


Using proprietary software for development of products can also significantly lower a company's valuation, especially if the owner of the original proprietary software demands royalties or part-ownership of the resulting IP.

7- The courts have yet to weigh in on the General Public License. Without legal interpretation, the use of the GPL could be perilous to users in a number of scenarios.

If corporations have concerns about legal interpretations of the GPL, they should consult qualified lawyers. IBM, for example, has a massive and top-notch legal team, and it seems to have no qualms about using, creating and distributing GPL'd software. If the AdTI would give us concrete examples of legal concerns, we could discuss them, but as it is, all we are given is conjecture, hand-waving and supposition.

Roaring Penguin's Conclusions

The AdTI claims that the GPL is "acquisitive", yet fails to note that even the most liberal of proprietary licenses is far more restrictive and places far more encumbrances on derived products than the GPL (if, in fact, it even permits derived products in the first place).

The AdTI says that the free software community is a "myth", but fails to explain the tens of millions of lines of high-quality code produced by this mythical community.

The AdTI promised to show how using GPL'd software could threaten security, but failed to deliver. Rather, Microsoft's own Jim Allchin admitted under oath that flaws in Microsoft software, if disclosed, could endanger national security.

The AdTI claims that free software damages members of the "IP community" (by which it means proprietary software vendors), but then fails to show how such damage occurs. Even if free software does damage proprietary software vendors, AdTI fails to show why that is a bad thing for citizens in general.

AdTI raises the hoary old "Total Cost of Ownership" issue, but does not demonstrate that proprietary software is more cost effective. AdTI ignores studies like the one from CyberSource or even Roaring Penguin's own case studies in Free Software in the Real World.

The entire AdTI study is a commercial funded by Microsoft, whose sole aim is to counter the growing adoption of GPL'd software. The report contains nothing constructive or useful. It is a sham.

Link to the article http://www.smh.com.au/articles/2002/06/11/1022982836568.html
Link to Opening the Open Source debate - I http://www.smh.com.au/articles/2002/06/10/1022982813264.html
Alexis de Tocqueville Institution white paper on Open Source Software
http://www.adti.net/html_files/defense/opensource_whitepaper.pdf
********************
Wired News
Browsing Around for New Targets




Jeffrey Zeldman and the Web Standards Project are back with a wake-up call for Web developers everywhere: The problem today isn't Microsoft or Netscape - it's you.

The Web Standards Project, or WaSP, was founded in 1998 by a group of high-profile Web developers -- including Zeldman -- who were tired of building different versions of every page to support a plethora of incompatible browsers from Microsoft, Netscape and others.

"The term 'Web standards' didn't even exist back then," Zeldman said from New York, where he still works as a Web consultant.

But succeeding versions of browsers became more compatible with standards and each other, and WaSP scaled back operations in 2000.

Now, Zeldman says the time has come to address the other, possibly tougher roadblock to universal Web accessibility: those who build sites, not browsers. "Though today's browsers support standards, tens of thousands of professional designers and developers continue to use outdated methods" for architecting and building online content, says the mission statement on a new version of the WaSP site to be launched Tuesday.

The result, according to WaSP's statement, is a locked-out audience: "Highly paid professionals continue to churn out invalid, inaccessible sites" unreadable by surfers using off-brand browsers, wireless Web devices or special-access technology for users with disabilities.

As site owners become more concerned about government-mandated accessibility (known colloquially in the United States as "Section 508," after the relevant section of the Rehabilitation Act amended by Congress in 1998), Zeldman says businesses are simultaneously looking for ways to reach a wider market - meaning customers not sitting at a desktop computer - without spending on additional coding projects.

By conceiving and coding Web content in line with standards defined by the World Wide Web Consortium (aka W3C), Zeldman says, "You don't have to build three or four versions of each page. You don't have to make printer-friendly versions. That's important on the $40,000 contracts we're getting today," as opposed to multimillion-dollar sites built during the dot-com boom.

But compared to getting a few companies to play along, convincing an industry of consultants to focus on standards compliance may be more of an uphill task for the resurrected WaSP. "It's a long battle to convince people who are billing by the hour to change the way they work," Zeldman said in a phone interview from Manhattan.

"We plan to lovingly guide our peers toward accessibility and standards compliance," he said. "And if that fails, we plan to guilt-trip them. And if that fails, we will ridicule them mercilessly, as we once ridiculed Netscape and Microsoft."

If developers won't listen to Zeldman, they might pay more attention to a newer member of the WaSP's steering committee: Tim Bray, who co-invented XML for the W3C in the late 1990s and now runs Vancouver-based Antarcti.ca Systems, a company that does XML-based data visualization.

"I'd like (Web developers) to look at the reality of the modern Web," Bray wrote in an e-mail. "Standards-compliant browsers with beautiful rendering, ADA section 508, the Internet Explorer monoculture under assault from Gecko and non-PC-form-factor devices, Web Services. I'd like them to conclude that the only sane, sensible, affordable way forward is to go standards-based on all their content."

"At one time standards were an optional extra; I just don't think that's true any more."

Bray agrees with Zeldman that a tighter economy should make Web consultants think about their methodologies. "The U.S. Federal Government is one of the good guys in what they say, if (unfortunately) not always in what they do," he said. "But at the end of the day, I think the leadership comes from the market: It's safer, cheaper and better to build your Web presence in a standards-based way rather than otherwise."

Even some of the WaSP's former targets support the renewed efforts. "WaSP is right in pointing out that websites are behind in using standards that are supported by all current browsers," said Håkon Lie, CTO of Opera Software in Norway. "Most major websites can also be improved by removing intricate table layouts and superfluous markup. There is absolutely no need to use font elements anymore."
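
Lie's point about font elements can even be checked mechanically. As an illustrative sketch (not a tool mentioned in the article), Python's standard-library HTML parser can flag a few of the deprecated presentational tags he is talking about:

    from html.parser import HTMLParser

    # Presentational tags that standards-based markup replaces with CSS;
    # this list is a small illustrative subset.
    DEPRECATED = {"font", "center", "u"}

    class LegacyMarkupChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.hits = []

        def handle_starttag(self, tag, attrs):
            if tag in DEPRECATED:
                self.hits.append((tag, self.getpos()))  # (line, column)

    checker = LegacyMarkupChecker()
    checker.feed("<p><font color='red'>news</font></p>")
    print(checker.hits)  # [('font', (1, 3))]

Run over a page's markup, it lists each offending tag with its line and column, which is the sort of mechanical check validators perform.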

But one HTML contractor, who asked not to be named, illustrated the uphill battle the WaSP faces in getting programmers to lay aside their old browser-specific tricks: "Do you know how much I get paid for knowing this stuff?"
******************
Wired News
New E-Waste Solution a Mine Idea


Mark Small has a radical solution for dealing with the glut of old computers, cell phones, DVDs and other electronic waste: mining.

Rather than allowing electronic junk to simply amass in landfills, Small wants to deposit huge volumes of e-waste into abandoned open pit mines.

Using the same techniques that miners use to process copper ore, valuable materials such as copper, iron, glass, gold and plastic could be extracted from electronic scrap.

"The products that we make (in the electronics industry) are made in an incredibly efficient way," said Small, vice president of corporate environment, safety and health for Sony Electronics. "We have to use that same type of philosophy in dealing with the end-of-life issue."

There are more than 550,000 abandoned hardrock mines in the nation. A single open pit mine has the capacity to hold 72 billion computers, Small said.

Instead of viewing obsolete computer monitors and televisions as defunct sources of cathode ray tubes, "We should look at it as a commodity," Small said.

Mining is the "best, most efficient way to process materials with low value content (like e-waste)," Small said. "There's essentially no cost in this."

Electronic waste isn't that big in terms of quantity or volume. In fact, mining produces approximately 300 times more waste than electronics does every year, Small said.

E-waste is often richer in rare metals than virgin materials, containing 10 to 50 times higher copper content than copper ore. A cell phone contains five to 10 times higher gold content than gold ore, Small said.
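
Taking the article's multipliers at face value, the arithmetic is easy to run. In the sketch below the base ore grade is an illustrative assumption, not a figure from the story:

    # Assume a typical copper ore grade of 0.5% by mass (illustrative only).
    copper_ore_grade = 0.005
    ewaste_low, ewaste_high = 10 * copper_ore_grade, 50 * copper_ore_grade

    print(f"Copper per tonne of ore:     {copper_ore_grade * 1000:.0f} kg")
    print(f"Copper per tonne of e-waste: {ewaste_low * 1000:.0f}-{ewaste_high * 1000:.0f} kg")

With those assumed numbers, a tonne of e-waste would yield 50-250 kg of copper against 5 kg from a tonne of ore.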

Small's solution could be an alternative to small-scale regional recycling programs that are employed today.

While some vendors have employed successful domestic computer take-back programs, other manufacturers ship electronic junk overseas to China, India and Pakistan.

By using mining techniques, "there's no need to ship e-waste overseas," Small said. "You can process it right here."

E-waste mining can take place anywhere. The process could be done on a large concrete pad, rather than an old pit mine.

Small says e-waste mining is both economically and environmentally sustainable. The method could be used only in mines that are free of groundwater problems.

"It would be foolish to process in an area that could cause a contamination problem," Small said.

But the cost of collecting and transporting material limits the sheer volume of e-waste that is ready to be processed.

"The big issue is we don't have enough material now," Small said. "As soon as we get more material, this could be done."

Only a system that allows for mass production will be able to address the proliferation of new technologies that may soon become obsolete, Small said. "We need a system to handle the 20-30 million TVs that are produced a year," Small said.

Small's idea could mark a dramatic shift from the traditional method of disassembling electronics.

"It's definitely out-of-the-box thinking," said Steve Changaris, northeast region manager for the National Solid Wastes Management Association.

Still, it could take 5 to 10 years to accumulate enough materials to be mined.

If a huge volume of e-waste material can be aggregated to reach a critical mass and mining can be implemented cost-effectively, then Small's proposal could succeed.

"Some state or some jurisdiction is going to try this," Changaris said. "I believe there's a distinct possibility that this could work."

But while Small's idea could be less costly than traditional electronics recycling, critics say that it could add to the mining industry's devastating impact on the environment.

"I don't believe that the environmental community would allow us to dump our electronics and add to the mining waste," said Robin Ingenthron, vice president of ElectroniCycle Inc.
Small's proposal "shines a light on how atrocious the standards for mining are," Ingenthron said.


Even improper recycling poses less of an environmental threat than mining raw materials to produce the rare metals found in electronics, Ingenthron said.

According to the Environmental Protection Agency's 2000 Toxics Release Inventory, the hardrock mining industry is the nation's largest toxic polluter. In 2000, the mining industry released 3.34 billion pounds -- or 47 percent -- of all toxics released by U.S. industry.

"Pound for pound, open pit mining is the most-subsidized, most-polluting, lowest-employment-generating business in the world," Ingenthron said.

Mining precious materials from the Earth requires 30 percent more energy than recycling them from old computers. Those economics have caused recycling practices to constantly improve as nations develop.

"Sure, we need to shut down a few lazy recyclers," Ingenthron said. "But even the worst recycling sites are superior to mining. Mining is much harder to remediate and make right. It's fairly easy to recycle correctly."

The General Mining Law of 1872 established current mining standards. The law allows companies to mine publicly owned minerals for free and pay no more than $5 an acre for mineral-rich lands. If that law is repealed, then mining for e-waste might be more costly.

"The only reason why (mining e-waste) sounds like a good idea is because of how cheap it is to meet the standards of the 1872 law," Ingenthron said.

"As long as the 1872 Mining Law gives the gold, copper, silver and lead away for free, we will have to charge a fee to recycle that material from used electronics. We should be able to recycle for free, on the value of the rare metals."

Although Small's idea would produce less waste than extracting raw material from hard rock, the re-use of secondary materials such as glass and plastics would be lost, Ingenthron said.

"Of course recycling would be cheaper if we lowered our standards to those of the mining industry. You could drop monitors into open pits of arsenic and cyanide, and walk away from the residue after you got the copper and gold out. But it makes more sense to raise mining up to the standards of recycling."
********************
NewsFactor
'Grid' Computers to Simulate Terror Scenarios


Researchers say a futuristic computing technology will help government agencies prepare for worst-case scenarios involving terrorist attacks. For the complete story, see: http://www.newsfactor.com/perl/story/18168.html
********************
NewsFactor
IBM To Announce Chip Technology That Drains Less Power


IBM recently designed a chip for Cisco that contains more than 35 million "gates," or logic circuits. IBM claims it is the most complex custom-made semiconductor ever built. For the complete story, see: http://www.newsfactor.com/perl/story/18140.html
********************
Peoples Daily China
China, Japan Signed Memorandum on Internet Technology Cooperation


The Chinese and Japanese governments have signed a memorandum of understanding Thursday in Tokyo on Internet technology cooperation.

Headed by Wang Chunzheng, deputy director of the Chinese State Development Planning Commission (SDPC), the Chinese economic delegation exchanged views with Japanese officials on macroeconomic policies, social development, and economic and technological cooperation between the two countries during its visit to Japan on June 5-6.

After a year of discussions, the SDPC and the Japanese Ministry of Economy, Trade and Industry have reached a common view on Internet technology development and cooperation and signed the memorandum of understanding on IPv6 technology cooperation.

This marks a new development in information technology cooperation between the two countries.

According to the memorandum, the relevant departments and enterprises of the two countries will invest a total of some 4 billion Japanese yen (125 yen = 1 US dollar) to develop IPv6, a new-generation Internet technology.
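
For readers unfamiliar with IPv6, the difference is visible even at the programming level. A minimal Python sketch follows; the host name is a placeholder for any IPv6-reachable server:

    import socket

    # IPv6 uses the AF_INET6 address family; addresses are 128 bits wide,
    # and the connect tuple grows to (host, port, flowinfo, scope_id).
    with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as s:
        s.connect(("ipv6.example.net", 80, 0, 0))  # placeholder host
        s.sendall(b"HEAD / HTTP/1.0\r\nHost: ipv6.example.net\r\n\r\n")
        print(s.recv(200))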
*******************
The Jerusalem Post
Israeli companies not adequately prepared for disaster
By SHARON BERGER


Despite the dramatic events of September 11, companies have not taken certain lessons to heart, in particular the need for backing up information at alternative locations and implementing a data continuity plan.

Israeli technology companies, which are heavily involved in research and development, would appear to be an obvious choice for making great efforts to ensure the security of their intellectual property and the smooth functioning of their operations even under problematic conditions.

Portfolio companies of venture capital funds are being increasingly encouraged to ensure their data is safely backed up on a regular basis. Yet at least one local lawyer believes that Israeli companies are not only exposing themselves to unnecessary risk but are also opening themselves up to be sued for not taking preventive action.

International business development lawyer Abraham Sofer, from Tel Aviv-based Lipa Mier and Co., notes that under common law companies can be liable for not exercising good business judgment, which includes management failing to prepare for contingencies it should have foreseen. Although Sofer considers such a case highly unlikely, he did not discount the possibility of shareholders of American-registered companies holding those companies liable for having put their management in jeopardy if war breaks out in the region.

In such a case the successful operation of the company could be endangered, and therefore the shareholders may have a case, Sofer said.


Furthermore, Sofer noted that companies have a legal requirement to implement a contingency plan to avoid financial losses. "Any companies not taking steps to prevent such disasters are not exercising business judgement," he said. He noted that many insurance companies offer discounts to companies which have comprehensive recovery capabilities. In the US, following September 11, companies looking to renew their policies were strongly encouraged by their insurers to implement such plans so as to minimize their exposure to future potential disruptions. Among the requirements of these contingency plans is off-site backup capability.


Although the violence between Israelis and Palestinians has not abated, and the threat of escalation exists with a possible attack by the US on Iraq, few experts believe that companies here need to go so far as to back up information outside the country.

Motorola, which has over 2,000 engineers in Israel, has "no framework for backup abroad," said spokesman Uri Ginosar. However, a number of local companies do have back-up storage in remote locations, such as the desert.

Sofer explains that with the technical capabilities available to avoid lengthy computer downtime, those who do not take adequate steps open themselves up to a potential wave of litigation. He has a checklist for companies of suggested steps including: management involvement, a detailed plan in writing, which has been tested, back up on a daily basis with at least one copy off site, an understanding of contracts with outside sources and their liabilities in case of disaster, as well as having alternate sources lined up.
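
The daily-backup item on that checklist is the easiest to automate. A minimal sketch follows; the paths are placeholders, and a real plan also needs rotation, verification and regular testing, as Sofer stresses:

    import datetime
    import shutil
    import tarfile
    from pathlib import Path

    # Create a dated archive of the data directory, then replicate it to a
    # second location standing in for off-site storage. Paths are placeholders.
    stamp = datetime.date.today().isoformat()
    archive = Path(f"/backups/local/data-{stamp}.tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        tar.add("/srv/data", arcname="data")
    shutil.copy2(archive, f"/mnt/offsite/data-{stamp}.tar.gz")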

Back-up services, standard IT procedure in larger companies, are generally provided by conglomerates such as IBM, Sony and Hitachi, on hard drives and other large storage devices.

They are designed to protect companies from physical-site problems such as fires, flooding, electricity blackouts and earthquakes, which could disrupt their data continuity. Particularly for companies that need continuous access to their data, such as online trading sites, disruptions can be disastrous, warned Michael Beldie, a business consultant for Partners 500. The rather "unsexy" field of data storage recovery is expected to grow to $4.7 billion worldwide by 2005, he said.


The need for back-up services was heightened by the September 11 attacks, but local business consultants note that, particularly in large organizations, it is also necessary to guard against angry former employees, who are capable of causing extensive damage. Israel Kahana, director of business continuity solutions at E&M Computing in Ramat Gan, also warned of "fingeritis," where employees accidentally disable applications or delete vital information, particularly common in the process of installing new software. This is in addition to more mundane difficulties such as system failure or security violations leading to corruption of data.

Beldie notes that some 60% of most companies' downtime results from hardware problems.

According to Beldie and Kahana, an adequate plan for business continuity must be implemented not only by technical staff, but must include the involvement of top-level management.

"Enlightening management or the board of directors must come first in order to kick start the process.

They are the ones with the authority to resign resources, both financial and human."

Beldie refers to a study by the Gartner Group showing that although some 85% of large corporations have some kind of disaster recovery plan, only 15% have some kind of business continuity plan. Of those, most have only an outline rather than a detailed response plan.

"There is a large gap between perception and what steps actually need to be taken." Sofer warns that companies must not only have these plans in place but they must ensure that they actually work.

He admitted, "September 11 bought disaster recovery to the forefront of people's minds. Now corporations are looking at all kinds of preparation."

Kahana believes some 60-70% of the Israeli market is now considering implementing some sort of backup plan, but warned that this realization will take a while to translate into increased business. Beldie and Kahana recommend that all companies examine what the impact of a disruption to their data services would be.
*******************
Computerworld
Mass. roundtable panelists talk up broadband, Web security


BOSTON -- Massachusetts IT leaders had two main messages yesterday for Bruce Mehlman, the U.S. Department of Commerce's assistant secretary for technology policy: They want to see universal broadband access and greater IT security.
Panelists at a Mass eComm roundtable discussion here yesterday said that broadband could change the commercial landscape of the country by breaking traditional telecommunications companies' hold over the flow of information.


Economic growth depends on access to broadband connections, said Dan Bricklin, founder and chief technology officer of Concord, Mass.-based Trellix Corp. He criticized the telecommunications industry for its restrictive practices in the past, noting that touch-tone phones were introduced in 1964 but people had to pay 90 cents per month to use them. That seemed like more of an attempt to make money off consumers than to advance technology, Bricklin said.

Cambridge, Mass.-based ChannelWave Software Inc. President and CEO Chris Heidelberger cited his own experience. He recently moved from a community with high-speed Internet access to one without it. Before the move, his entire family used the Web for almost all its needs, from shopping to banking. In their new home, the family has only a slower dial-up account -- and has reverted to its pre-Internet lifestyle.

According to Heidelberger, if business is going to really develop on the Web, it will need broadband to do so.

Linked with that access is the need for better security. Panelists at the session told Mehlman that their biggest concerns are attacks on the IT infrastructure that would damage consumer confidence and erode gains already made in the public's acceptance of e-commerce.

Although some panelists said they fear another massive Sept. 11-type attack, others, such as Simson Garfinkel and Bricklin, are more worried by the Web's brittleness.

"We shouldn't live in a monoculture," said Garfinkel, author of several IT books and CTO at Sandstorm Enterprises Inc., a security company in Cambridge, Mass. Garfinkel said there is an overdependence on Microsoft Corp.'s products. The Windows operating system has its place in the world, he said, but to be truly safe, businesses and the IT infrastructure as a whole need diversity. This means more Linux- and Macintosh-based operations, he said.

Bricklin echoed Garfinkel's call and said many security measures put forth by companies and vendors remind him of the man who searched for his wallet under a lamppost because the light was better -- even though he lost it somewhere else.

Garfinkel was even more direct, saying he seldom flies anymore -- not because he fears a terrorist attack, but because of what he called meaningless exercises in airport security designed more for show than preventing an attack.

He and Michon Schenck, president and chief operations officer at Concord, Mass.-based Financial Fusion Inc., both noted the massive amount of security around the World Trade Center to show how security can be circumvented.

Garfinkel said his backpack was searched every time he went into the building -- even though the 1993 attack on the World Trade Center came from a truck bomb. That type of security, he argued, was wasteful and ultimately useless.

"We didn't see an attack in civil aviation; we saw an attack on a building," he said of the pre-Sept. 11 preparations.

Panelists said the same could be said about the approach many people take to IT security. "Inconvenience people, and they will think you are doing something," Garfinkel said.

He and the others warned that future cyberattacks could be far more devastating: worms and viruses have so far been designed to cause minor disruptions and serve as a nuisance, but, they said, if such attacks carried more destructive payloads, the damage to the nation's economy could be a lot worse.

"The people who are really good at doing this don't have the incentive, and the people who have the incentive usually don't have the knowledge," Garfinkel said.

He also chided Internet service providers for not doing enough in terms of security screening and scanning for viruses, worms and other attack agents.

Curt Lefebvre, founder of NeuCo Inc. in Boston, said consumers also have to get involved by demanding that businesses take security more seriously.

The government also has a role to play, said Financial Fusion's Schenck. Noting that federally-chartered banks must meet regulatory criteria to stay in business, she said there aren't any federal criteria to measure security for online banking.

"Online banking is not even on the list," she said.
***************
Computerworld
Clarke: Homeland security revamp to help cybersecurity

RESTON, Va. -- White House cybersecurity chief Richard Clarke said today that a plan to reshuffle the federal government's cybercrime agencies into a new cabinet-level homeland security department will improve federal coordination with the private sector.
"It will concentrate our forces, it will concentrate the skilled staff that we have and will ensure better cooperation and better coordination both within the government and the private sector," said Clarke.


In a proposal outlined by President Bush late last week, the new department would include the FBI's National Infrastructure Protection Center and the U.S. Department of Commerce's Critical Infrastructure Assurance Office. Both agencies work extensively with the private sector.

Clarke, speaking here at the Networked Economy Summit sponsored by George Mason University, also warned that the dangers posed by worms, viruses and system intrusions are as urgent as ever -- and on the rise.

"Digital Pearl Harbors are happening every day, they are happening to companies all across the country," he said. According to Clarke, such cyberincidents cost the economy $15 billion last year.

Clarke and other federal officials have been holding a series of meetings around the country to raise awareness and gather information for a planned national strategy due out by mid-September. That strategy, which is being developed with the help of industries representing critical sectors such as finance, energy and transportation, is intended to map out a plan for improving security protection.

But the government awareness campaign has also been "a little dirty," Clarke told his audience, many of whom work for IT companies in Northern Virginia.

In particular, federal officials have been going to private-sector companies and telling them to pressure vendors to improve security with this message: "Why aren't you using security offerings as a discriminator among the people from whom you buy?"

Clarke said he has also been meeting with insurance companies about writing cybersecurity insurance for firms that meet certain criteria.

A key goal is improving the security of federal agencies, which have frequently been found to be lacking by the congressional watchdog agency, the U.S. General Accounting Office.

In that regard, the Bush administration's proposed budget for next year includes $5 billion in new funding to improve security at federal agencies. Clarke said the private sector won't take the federal government seriously as long as the government itself has problems.

This was good news for the vendors at the conference.

"There is a tremendous opportunity for private-sector involvement in homeland security areas," said Rep. Tom Davis (R-Va.), who predicted "billions of dollars" of new federal IT spending on homeland-related security.

The bulk of this new spending "is not going to new federal employee manpower, but is going to contractors, innovators, information technology companies," said Davis.

The proposed homeland office reorganization won the endorsement of one vendor, Jack London, chairman and CEO of CACI, a Northern Virginia-based IT firm. He said it will allow the government to produce "a single data picture of threats against our homeland."

But one technology effort that "should command early focus," London said, is the development of interoperable identification control systems that would allow federal agencies to work with law enforcement, as well as the private sector, to correlate potential terrorist activity and threats.

Virginia's economy relies heavily on the tech sector, which employs about 325,000 people in the state. It's also home to numerous military bases and network hubs that handle Internet traffic.

"Virginia is a target-rich state," said the state's governor, Mark Warner. "Literally, half of the Internet traffic in the world flows through Northern Virginia. A disruption to that traffic could have worldwide implications."
************************
ZDNet
EU: MS Passport is under investigation


BRUSSELS--The European Union is examining charges that Microsoft's .NET Passport system breaks EU rules on data privacy, a European Commission official said on Tuesday.
The official said he expected member states to make a formal announcement after July 1.


The official was commenting after Microsoft called a news conference to deny the Commission or any EU state was formally investigating the .NET Passport system of collecting personal data from Internet users.

"Not one of the 15 member states or the European Commission is investigating Passport," Peter Fleischer, senior attorney for law and corporate affairs at Microsoft, told reporters.

He was referring to reports last month that the EU was looking into whether the system complied with strict EU rules on data protection.

The Commission official confirmed no formal investigation had yet started but suggested action may be in the pipeline.

"It is correct to say there is no formal investigation at the moment," he said.

"But the issue is under consideration and we are hoping for some kind of announcement from the member states after July 1."

EU national privacy controllers, the officials charged with monitoring compliance with EU privacy laws, are due to meet on July 1, the official added.

Privacy matters
Earlier this month the EU executive said in a letter to a member of the European Parliament it was "a matter of priority" for it to look into Microsoft's .NET Passport service in co-operation with EU member states.


Several national privacy controllers said last month that associations aimed at protecting privacy had been asking governments to open an investigation and could get their way.

Fleischer said Microsoft was in a "constructive dialogue" over this issue with the Commission and EU states, but would not give details of what specific concerns the Commission had about .NET Passport.

Under EU data privacy rules, customers' personal data can only be used by a firm or passed on to others with prior consent from the individual.

While the Commission has authority to help member states interpret EU law, legal action would be launched by the individual member states.

Privacy groups and their allies argue that Microsoft's free .NET Passport service collects personal information while consumers are making purchases, playing games or doing bank transactions on line.

Richard Purcell, Microsoft's Corporate Privacy Officer, denied the allegations, saying that customers give data on a voluntary basis. Such data normally include usernames, passwords, email addresses and, in some cases, phone numbers.

Any investigation would be separate from a probe by the competition arm of the Commission, which is looking into allegations that Microsoft's Windows operating system works better with Microsoft's own server software than with that of its rivals.

Microsoft said last month it had also supplied information to the Federal Trade Commission in Washington, which has received similar complaints.
***********************
Sunspot.net
Blind gain free access to 53 newspapers online


Program: The blind are now able to dial an 800 number and have dozens of newspapers read to them by computer.

There was a time when Michael Baillif, a tax attorney in Washington who is blind, would have to ask the person who reads for him at work to spend the first few minutes of the day reciting the sports pages.

Or when he felt he had to scramble to keep up with news on the radio, rather than having access to the newspaper at his leisure.

"If I was going to read anything at all, I'd have to pay a reader to do it," Baillif said. "And what's more, I'd have to do it on their schedule."

Those days are long gone, thanks to technology at the National Federation of the Blind that allows people to dial into a computer and have the paper read to them by the machine.

It's part of the Baltimore-based federation's free Newsline program, which has spread nationwide.

That means blind people around the country can dial an 800 number and have any of 53 newspapers read to them by computer. Before the program went national, listeners could hear news only from papers in their area or had to pay for a long-distance call to use the service.

Baillif uses Newsline to listen to The Sun, the Washington Post, the New York Times and, sometimes, USA Today and the Wall Street Journal.

"I've got a lot more flexibility," said Baillif, 35. "At work I often enjoy grabbing lunch at my desk and listening to the paper over lunch."

Here's how Newsline works:

The National Federation of the Blind gets an electronic feed from each newspaper with that day's articles (advertisements are not included), which is then translated into a computer-generated voice. It takes six to 15 minutes for the computer to translate the feed into a voice.

Users can then call a server at the center and navigate through the system using their telephone keypads. They can choose what paper to listen to, then what section. The voice tells them how many articles are in a section and begins to read the first article in the section.

The users have the option of skipping ahead to other stories, fast-forwarding or going back in a story, searching for a word in an article or having a word spelled out for them. Newspapers for the current and previous day are available in the system, and the Sunday editions are stored for a week.
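
At heart, that keypad navigation is a small state machine. The toy model below invents its own menu layout and key bindings purely for illustration; it is not Newsline's actual interface:

    # Toy keypad-menu model: the first key picks a paper, '#' skips ahead,
    # '*' goes back. Papers and articles are placeholder strings.
    papers = {
        "1": ["Sports: opener...", "Metro: lead story..."],
        "2": ["World: top story..."],
    }

    def navigate(keys):
        articles, idx = None, 0
        for key in keys:
            if articles is None:
                articles = papers.get(key)             # choose a paper
            elif key == "#":
                idx = min(idx + 1, len(articles) - 1)  # next article
            elif key == "*":
                idx = max(idx - 1, 0)                  # previous article
            yield articles[idx] if articles else "main menu"

    for spoken in navigate("1#*"):
        print(spoken)

The production system described in the article layers a computer-generated voice over menu logic of roughly this shape.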

"We wanted to build something that could provide instant access, timely access and with a minimum amount of effort," said Peggy Chong, program manager for Newsline.

Newsline began as a pilot program in 1994, with USA Today as the only newspaper participating. The paper was accessible by calling a local number in Baltimore. In 1996, the program expanded to include a half-dozen newspapers that could be heard by calling servers in Louisiana and Minnesota.

As the program grew, newspapers were fed into local servers in different regions. Chong said the servers were kept anywhere the National Federation of the Blind could find to house them - from public libraries to an insurance company to a laundry room in a nursing home.

Each server was limited to seven papers. If an area had no servers, people had to call long-distance to use the system.

This year, Newsline went national with the help of a $4 million grant from the Institute of Museum and Library Services.

Chong said that when the program began it was a challenge to persuade newspapers to participate. Some worried that circulation would go down because people would use the service instead of buying the paper, she said.

But the number of participating papers is rising, as is the number of users. Newsline hopes to have 75 to 100 papers online by year's end, and the program already has more than 36,000 participants, Chong said. Newsline also hopes eventually to give users access to government documents through the service.

Leon Rose, 77, who is blind, lives in Columbia and plans to move with his wife to Nevada for the winters. He said the national reach of Newsline would help him stay on top of Maryland news from his second home.

"I will be able to continue to stay in touch with what's going on in Maryland on a toll-free basis," he said.

Baillif, the tax attorney who used to have his reader at work read the sports section to him, said he feels the same way about Newsline as he does about cellular phones.

He had vowed never to get a cell phone, but now that he has one, Baillif says he couldn't live without it. Similarly, he hardly noticed the lack of access to information before he used Newsline, but now that he uses the technology, he finds it's a fundamental part of his life.

"What's tremendous about it," he said, "is it's a wholly independent way that I can obtain access to information the way everyone else does."
*********************


Lillie Coney
Public Policy Coordinator
U.S. Association for Computing Machinery
Suite 510
2120 L Street, NW
Washington, D.C. 20037
202-478-6124
lillie.coney@xxxxxxx