Volume 4, Issue 432: Monday, December 9, 2002
“Communications Executives Endorse Security Regulations”
Associated Press (12/07/02); Kellman, Laurie
A 300-item to-do list released on Friday by the Network Reliability and Interoperability Council recommends what the communications sector should do to ensure network security against terrorist attacks. The panel said that these suggestions should be deployed voluntarily by industry, although FCC Chairman Michael Powell acknowledged that some could be implemented by government regulation. Council recommendations included closer monitoring of networks to make sure idle computers are shut down and therefore closed off to hackers; developing plans for communications equipment and plant inspections in case of disaster; training personnel to manage proprietary information and identify suspicious items; implementing firewalls, perimeters, and other “layered” security structures; accelerating attack response times; and ensuring that networks and information are not accessible to unauthorized personnel. The council also advised that ISPs and cable, wireless, and data companies voluntarily disclose power outages as part of a 2003 pilot program. The purpose of all these recommendations is to enable companies to maintain customer service operations by collaborating with partners or rivals during crises, and to demonstrate that industry can address security issues with little government regulation.
“E-Mail Overload Is a Myth, Study Says”
Washington Post (12/09/02) P. E5; Henry, Shannon
The Pew Internet & American Life Project says not as many U.S. workers feel overwhelmed by email as is widely thought. In all, around 60% of the 1,003 workers surveyed by telephone reported getting 10 or fewer email messages per day. Just 4% of all people who use email at work reported receiving too many messages every day, while 65% of those interviewed said email was not a problem whatsoever. Senior Pew researcher Deborah Fallows said she was surprised by the results, and surmised that a small but vocal minority of email users are what she calls “power emailers.” According to the study, 11% of those heavy users felt their inboxes were deluged with email every day. Gartner analyst Maurene C. Grey said the Pew study was suspect, given the high profile of email growth and the success of email management software, such as the anti-spam software from Brightmail. That company reports that spam totaled 35% of all incoming messages as of September 2002, up from just 8% one year earlier. However, spammers target individual users of free email accounts such as those from Yahoo!, AOL, and Microsoft, since such addresses are easy to guess. Corporate email accounts have become harder targets for spammers because of increasingly spam-savvy CIOs and better email management software, according to Pew.
“Wanted: a Few Good Hackers”
Wall Street Journal (12/09/02) P. R7; Dreazen, Yochi J.
The federal government is eyeing hacking as a way to solve some of its computer security problems, including helping it find critical software vulnerabilities and stopping the trade of pirated digital content. White House cybersecurity advisor Richard Clarke this summer appealed to hackers at the annual Black Hat hacker convention, asking them to aid the country in closing security holes. He said hackers should first contact firms about their software weaknesses, but turn to the government if they elicited no response. In order to protect such do-gooding hackers, Clarke also said the government would consider granting legal protections to those who report successful hacks. On another front, Rep. Howard Berman (D-Calif.) continues to push legislation that would give movie and music companies the legal right to hack people’s PCs and look for illegally owned copyrighted material. If found, the copyright owners would be authorized to disable those files through unspecified means, though the bill says no viruses can be released and that the companies must first notify the Justice Department of their actions. Perhaps a more common attack on illegal file-trading would involve denial-of-service attacks that would knock out Web sites that distribute pirated content. Critics of the bill say that it gives too much leeway to copyright owners and does not contain enough protections against overreach. Meanwhile, Rep. Rick Boucher (D-Va.) has introduced a bill that would legalize copying files that are already legally owned, as long as those copies are for personal use. In another bill, Rep. Zoe Lofgren (D-Calif.) proposes to allow legally owned content to be copied and played on any device.
“Wireless Research Senses the Future”
ZDNet (12/06/02); Charny, Ben
Deborah Estrin, director of UCLA’s Center for Embedded Network Sensing, foresees future applications of wireless technology that will dramatically change the way people handle resource management, transportation, and medicine, to name a few areas. She predicts that wireless technology will become not only smaller, faster, and cheaper, but also smarter, because of a convergence of wireless communication, sensory technology, and computation. Estrin explains that her facility is concentrating on developing self-healing networking devices with automated power management, which feature sensors and actuators so small that they can be poured into concrete or sewn into fabric. Researchers are working to address the scalability problem by setting up a system with a distributed architecture that processes signals locally, then collaboratively, using the nervous system as a model. Other projects the center is working on include tiny wireless transmitters distributed over every inch of a region that allow researchers to monitor microclimates, and “smart” buildings with embedded seismic sensors and wireless technology. Estrin considers making wireless systems self-healing to be the biggest technical challenge, and believes that commercial products will move out of the research phase and into the market over the next few years. She expects wireless technology to proceed both incrementally and with big breakthroughs, and agrees that regulation will be necessary for medical, public health, and safety-oriented applications. Estrin praises Intel’s wireless initiative, which includes projects within the company and at three university laboratories.
“Digital Copyright Law Up for Challenge”
Medill News Service (12/06/02); Madigan, Michelle
With the deadline for submitting public comments on the Digital Millennium Copyright Act (DMCA) to the Copyright Office coming up in less than a week’s time, experts are advising people on how to best present their case for exemptions. Computer programmer Seth Finkelstein, who authored a proposal submitted during the first review in 2000 that was subsequently passed, recommends that cases be “extremely detailed and factual.” Specific lawsuits and applications should be cited, he adds. Finkelstein successfully proved the validity of allowing people to access secret Web site blacklists contained in Internet filtering programs. He also plans to submit a new request for a DMCA exemption, noting that the surge of litigation could work in his favor. However, Finkelstein acknowledges that Internet filtering program vendors will be on the lookout for challenges during the current review period. Copyright Office officials recommend that comments be made in the proper format; in the first two months of 2003, the office will be receptive to responses to the initial round of public comments. The Copyright Office will present suggestions to the librarian of Congress, whose judgments should be announced next October. According to law, DMCA exemptions must prove that users are “adversely affected in their ability to make noninfringing uses due to the prohibition on circumvention of measures that protect access.”
To read more about DMCA, including ACM’s arguments against, visit http://www.acm.org/usacm.
“Lawmaker Reviving Failed E-Waste Bill”
SiliconValley.com (12/08/02); Puzzanghera, Jim
California Sen. Byron Sher (D-San Jose) intends to reintroduce a bill that would attach a $10 fee to the purchase of new computer monitors and TVs in order to fund e-waste recycling programs. This comes on the heels of Hewlett-Packard’s change of heart, after its earlier opposition to the same proposal prompted Gov. Gray Davis (D-Calif.) to kill the bill. The monitors covered by the bill contain cathode ray tubes, which are a source of hazardous lead. A similar bill from Rep. Mike Thompson (D-Napa) would cover other toxic materials besides lead, and require the fees collected from the sale of electronics to be channeled to the EPA, which would allocate grants to businesses, state and local governments, and other organizations to establish recycling programs. HP, however, wants Thompson’s bill to be restricted to cathode ray tubes. Meanwhile, Rep. Zoe Lofgren (D-San Jose) announced that she is considering presenting her own proposal at the next congressional session in January, and added that consensus among Democrats, Republicans, and industry will be critical if any national e-waste recycling legislation is to be passed. However, Thompson admits that addressing the growing problem of e-waste is not yet a pressing matter for congressional Democrats or Republicans. All computer manufacturers agree that a national policy, rather than computer-recycling laws that vary from state to state, is the best solution, notes Heather Bowman of the Electronic Industries Alliance.
“I.B.M. Plans to Announce Tiny Transistor”
New York Times (12/09/02) P. C4; Markoff, John
IBM researchers this week will report on a transistor that measures just nine nanometers long; a typical human hair is thousands of times wider than the device. The researchers, led by Meikel Leong, will present their work at the annual International Electron Devices Meeting, opening today in San Francisco. The researchers say the new transistor will extend semiconductor performance projections through 2016 or longer, and is necessary for studying electronics at the atomic scale. It is not clear that transistors can be made much smaller, due to instabilities. Moreover, as transistors shrink, other problems emerge, such as excessive heat, because the transistors are packed together more tightly. Still, IBM’s transistor, which is one-tenth the size of those in the most advanced chips currently in production, would make possible faster digital logic functions and higher-capacity memory.
(Access to this site is free; however, first-time visitors must register.)
“ICANN At-Large Reps Can Keep Jobs”
Wired News (12/06/02); Kettman, Steve
ICANN will extend the terms of its elected board members into early 2003 as part of the ICANN reform transition process. ICANN board member Karl Auerbach says that ICANN is doing so for appearances only, and he predicts that elected board members will be asked to stay on for at least several more months. ICANN CEO Stuart Lynn says no one is pressuring ICANN to keep these particular board members, and Lynn adds that ICANN will create a transition board to map ICANN’s reform implementation. Lynn says that ICANN meetings are held overseas in order to reach out to the global Internet community, and that the meetings are not just fun trips to exotic places, as some critics contend. Lynn adds that board members could be elected in the future if practical issues concerning how to conduct online elections fairly and fraud-free are figured out. ICANN critic Michael Geist says that ICANN’s real challenger is the International Telecommunication Union (ITU), “which continues to be very active in bringing together participants in the Internet governance field without much regard for what ICANN is up to.”
To read more about ICANN, visit http://www.acm.org/usacm.
“Robot Space Cowboys”
Wei-Min Shen and Peter Will of the University of Southern California’s Information Sciences Institute (ISI) have proposed a project in which robots using “hormonal” software would assemble a space station without human intervention. Under the aegis of their CONRO initiative, the researchers have developed working units of modular robots that can unite into larger groups or split up into smaller ones thanks to software that allows “bifurcation, unification, and behavior shifting,” according to Shen. Their SOLAR space station proposal calls for two species of robots to carry out construction: Solar-powered, self-configuring units that transmit signals to a second species of robots, which do the actual assembly. The second variety, known as “free-flying intelligent fiber rope matchmaker units” (whips), consist of two robots linked by a connector line whose length can be adjusted; using solar-powered rockets for propulsion and GPS sensors for self-location, the whips will respond to signals from the subassembly units, lock onto them, and tow them close enough for coupling to take place. Financial supporters of Will and Shen’s proposal include NASA, the National Science Foundation, and the Electric Power Research Institute (EPRI). The researchers detailed their plan at the Robosphere 2002 conference last month. The robotic units and the hormonal software are now undergoing trials at the ISI Polymorphic Robotics Laboratory.
“Software Giants ‘Trample Freedoms’”
BBC News Online (12/06/02); Ward, Mark
Richard Stallman continues to call for an end to proprietary software, and as president of the Free Software Foundation he is also leading an effort to develop the GNU operating system, which can be used without restriction. The open source movement has learned plenty from Stallman’s efforts since 1984, but he suggests that open source advocates do not realize that using software is an ethical matter. Stallman says the concept of software ownership by the likes of Microsoft, IBM, Sun, and Adobe stifles innovation because it does not allow users to find out how applications work or to work with others to improve their products. What is more, Stallman maintains software ownership is a policy that takes away users’ basic freedoms and human rights. And record companies and filmmakers are now using the same tactic to keep consumers from freely copying music and movies, he adds. “A whole generation has grown up with the idea that it is normal for them to have no freedom,” says Stallman.
“New Year to Bring Nastier Viruses Yet”
Computerworld New Zealand Online (12/04/02); Greenwood, Darren
Virus expert Daniel Zatz, a consultant at Sydney-based Computer Associates, predicts that next year’s viruses are likely to be more dangerous than ever. Some 250 new viruses emerged each month in 2002, down from 400 per month in 2001, but this year’s viruses have proven more destructive, he says. The most harmful is the Klez virus, which has been modified at least eight times. He believes many viruses are being written by older, unemployed software developers in Eastern Europe who are fine-tuning their skills. Zatz adds that new viruses such as Klez, Bugbear, and Braid often consist of older code that is “cut and pasted” in. Klez, for example, leaves another virus (Elkern.cav) as a by-product, he says. Many viruses in 2003 will also target Microsoft, he predicts. Zatz advises companies to use email gateways to filter out emails with .exe or .bat file extensions, apply software patches, and install user-behavior tools that monitor suspicious employee activity, an example of physical security “colliding” with IT security.
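The gateway filtering Zatz recommends amounts to inspecting each message’s attachments before delivery. The sketch below illustrates the idea using Python’s standard `email` library; the function name and the blocked-extension list are our own illustrative assumptions, not any vendor’s actual product.

```python
# A minimal sketch, assuming a gateway can inspect parsed messages:
# reject mail whose attachments carry risky file extensions.
from email.message import EmailMessage

BLOCKED_EXTENSIONS = {".exe", ".bat"}  # per Zatz's advice; extend as needed

def has_blocked_attachment(msg: EmailMessage) -> bool:
    """Return True if any attachment filename ends in a blocked extension."""
    for part in msg.iter_attachments():
        name = (part.get_filename() or "").lower()
        if any(name.endswith(ext) for ext in BLOCKED_EXTENSIONS):
            return True
    return False

# Build a sample message carrying a .exe attachment.
msg = EmailMessage()
msg["Subject"] = "Invoice"
msg.set_content("See attachment.")
msg.add_attachment(b"MZ...", maintype="application",
                   subtype="octet-stream", filename="invoice.exe")
print(has_blocked_attachment(msg))  # True
```

A real gateway would also catch renamed executables by inspecting content, not just filenames; this check is only the first layer.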
“Clothes Make the Network”
Technology Review Online (12/04/02); Rheingold, Howard
Wireless wearable technology could spur the formation of what Lancaster University’s Gerd Kortuem calls “ad hoc communities” in which people with similar interests and tastes can network and participate in social activities–resource sharing, gaming, knowledge exchange, etc.–via software agents programmed to carry out interactions based on wearers’ preferences. “The biggest effect of mobile wireless computing devices will become visible only after large numbers of people [who are strangers to each other] start using the technology to engage each other,” says Kortuem. His vision began at the University of Oregon, where he and other team members created ad hoc community programs such as Walid and mBazaar. From these experiments, it was concluded that ad hoc community applications must be capable of identifying nearby parties, querying and comparing data, and maintaining contacts. This led to the development of the Proem peer-to-peer platform, which is composed of 135 Java commands keyed to facilitate spontaneous social organization; networks of mutually trusting devices can be formed using the platform by enabling agents to exchange data about past transactions when they share data about current transactions. Kortuem says researchers’ most pressing need at this point is “a next-generation wireless device specifically designed for ad-hoc wireless connectivity.” His research at the University of Oregon proved that the technology is viable for small-group interactions, but his stint at Lancaster University will focus on large-scale interactions. Kortuem foresees that technology integrated with clothing could support ad hoc communities within the next 10 years.
“Is Big Brother Our Only Hope Against Bin Laden?”
Salon.com (12/03/02); Manjoo, Farhad
The U.S. Defense Department’s Total Information Awareness (TIA) program is an ambitious effort to collate all personal data–business transactions, relationships, registrations, etc.–on foreigners and citizens in an effort to spot suspicious activity that could precede a terrorist attack. The technical problems of implementing it are formidable: To know virtually everything, a practical impossibility, requires a massive amount of data; database integration difficulties will complicate deployment, while keeping track of databases will also be troublesome; and TIA’s mandate to monitor seemingly normal activities that may be statistically construed as possible preparation for a terrorist incident–in real time, no less–could generate many false positives. This raises the question of whether the system should narrow its focus to time-honored terrorist characteristics, or follow more generalized behavior patterns in order to thwart new kinds of attacks. Stanford University computer scientist Jeffrey Ullman, for one, thinks TIA is essential to civilization’s survival, because information technology has the potential to root out evil and shield personal freedom. Other researchers are hopeful that federally supported programs such as TIA will result in more overall research funding. However, civil libertarians are calling the project an invasion of privacy, and are also displeased with the appointment of former national security advisor John Poindexter to lead the program for the Defense Advanced Research Projects Agency (DARPA), which is sponsoring TIA development. Meanwhile, statistician Bobby Gladd sees the project as a waste of time and resources that could be put to better use in furthering information analysis. Nevertheless, computer scientists believe TIA could be acceptable to the public, as long as its technology and policies are publicly debated, and its policymakers are trustworthy.
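The false-positive worry is a base-rate problem: when the behavior being hunted is extremely rare, even an accurate detector flags mostly innocents. The back-of-envelope arithmetic below uses entirely hypothetical numbers to make the point.

```python
# Hypothetical Bayes-style illustration of TIA's false-positive problem.
# All figures below are invented for the sake of the arithmetic.
population = 300_000_000       # people whose records are monitored
bad_actors = 3_000             # assumed true targets in that population
sensitivity = 0.99             # P(flagged | bad actor)
false_positive_rate = 0.01     # P(flagged | innocent)

flagged_bad = bad_actors * sensitivity
flagged_innocent = (population - bad_actors) * false_positive_rate
precision = flagged_bad / (flagged_bad + flagged_innocent)

print(f"innocents flagged: {flagged_innocent:,.0f}")
print(f"precision: {precision:.4f}")
```

Even with 99% sensitivity and a 1% false-positive rate, roughly 3 million innocents would be flagged, and only about one flagged person in a thousand would be a genuine target.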
“When the Web Starts Thinking for Itself”
VNUNet (12/02/02); Green, David
The Semantic Web of Tim Berners-Lee is viewed as a way to fundamentally improve searching and data exchange on the Web. Using technologies such as eXtensible Markup Language (XML), Resource Description Framework (RDF), ontologies, and intelligent agents, the Semantic Web would have the power to process and analyze large amounts of information, and electronic appliances such as mobile phones will be able to use it to advertise their functionality. In particular, intelligent agents will give the Semantic Web its “proactive personalization” as thousands of interactive bots delegate information requests, creating an information value chain. The agent bots will communicate with both humans and other bots, and be given much autonomy; some are concerned about the type of information such agents will have access to, as well as their accountability. In response, the Joint Research Centre of the European Commission is using Semantic Web technology and the W3C’s Platform for Privacy Preferences protocol to create a test privacy protection agent. Observers have also expressed concerns about authenticating the veracity of content sources on the Semantic Web, while others are concerned that the vision of Berners-Lee, the creator of the Web, could create a dynamic self-organizing Web. With self-adaptive intelligence, some worry, such a global brain would evolve too quickly for humans to comprehend it. “The global communication network is already capable of complex behavior that defies the efforts of human experts to comprehend,” Daniel Dennett, director of the Center for Cognitive Studies at Tufts University in Medford, Massachusetts, is quoted as saying in the June 2000 issue of New Scientist.
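At the heart of RDF is a simple data model: every statement is a (subject, predicate, object) triple, and queries are pattern matches over a set of triples. The toy sketch below, with an invented vocabulary, shows how a device such as a mobile phone could “advertise its functionality” as machine-readable triples.

```python
# Toy model of RDF's triple structure; the terms below are invented
# for illustration and are not a real ontology.
triples = {
    ("phone:model42", "rdf:type",     "dev:MobilePhone"),
    ("phone:model42", "dev:supports", "svc:SMS"),
    ("phone:model42", "dev:supports", "svc:Voice"),
    ("svc:SMS",       "rdf:type",     "dev:MessagingService"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

# An agent asking: what services does this phone advertise?
print(sorted(o for _, _, o in match(s="phone:model42", p="dev:supports")))
# ['svc:SMS', 'svc:Voice']
```

Real RDF stores add URIs, schemas, and inference on top, but pattern matching over triples is the operation that lets agents query data they have never seen before.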
“Feds Spark High-End Computing Resurgence”
Federal Computer Week (12/02/02) Vol. 16, No. 42, P. 54; Robinson, Brian
Speakers at ACM’s recent Supercomputing 2002 conference said that federal agencies are focusing on and investing in projects that could lead to renewed demand for supercomputers. Moreover, National Science Foundation (NSF) Director Rita Colwell emphasized that carrying out basic IT and applications research, not just ratcheting up computing power, will be a major theme of this push. Projects that the NSF is supporting include the Grid Physics Network, which sets up connections between American and European scientists and could foster the development of petascale virtual data grids; the Network for Earthquake Engineering Simulation; and the TeraGrid network designed to make the most sophisticated computing resources accessible to researchers nationwide. Meanwhile, Energy Department Secretary Spencer Abraham disclosed that IBM won a contract to construct what will be the two most powerful supercomputers on Earth: The 100-teraflop ASCI Purple, whose chief function will be nuclear weapons simulation, and the 360-teraflop Blue Gene/L system, which will be used to model biological systems, explosives behavior, atmospheric turbulence, and other phenomena. Another project receiving a lot of government attention is the Cray X1 supercomputer, which Cray CEO James Rottsolk said is part of an initiative to build a system that operates at sustained petaflop speeds. Its design is based upon a single architecture, unlike the cluster scheme the other systems are using, which the company and its proponents believe is less expensive in terms of total life cycle costs, and more suitable for advanced science and engineering problems. Officials of the Defense Advanced Research Projects Agency (DARPA) report that cluster systems will not have adequate computing power to run national security applications, and DARPA is emphasizing such properties as easy programmability, portability, reliability, and scalability.
“Took a Licking, Kept on Ticking”
IEEE Spectrum (12/02); Cherry, Steven M.
On Oct. 21, the Internet’s 13 root servers received a flood of useless information intended to crash them by boosting traffic levels to about 1,000% above normal. The four U.S.-based root servers–some of which are managed by VeriSign and Nominum–functioned well, while nine servers located overseas experienced disruption. One root server at the University of Maryland was inoperable for 20 minutes during the data flood. Root-server experts want to implement reforms before the next headline-grabbing attack occurs, and Microsoft has already issued patches for software vulnerabilities that attackers used to infiltrate numerous computers, which were then commanded to launch the Oct. 21 query attack against Internet root servers. Some experts want lower-level servers to conduct more frequent caching as a way to reduce exposure to this type of attack. The University of Maryland has two Internet service providers–Qwest and Uunet–and only Uunet responded quickly to the university’s requests during the attack. Improving relations and coordination with ISPs is another issue root server operators should work on. “The root servers aren’t the best place to attack if you want to cost people money,” notes Nominum chief scientist Paul Mockapetris.
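The caching remedy works because a lower-level name server that holds an answer within its time-to-live (TTL) never needs to query upward at all, so repeat lookups place no load on the roots. The sketch below models that effect; the class, TTL value, and names are invented for illustration.

```python
# Sketch of TTL caching at a lower-level resolver: repeat lookups
# within the TTL are served locally and never reach the upstream
# (root) server. All names and numbers are hypothetical.
import time

class CachingResolver:
    def __init__(self, ttl_seconds, upstream):
        self.ttl = ttl_seconds
        self.upstream = upstream        # function simulating a root query
        self.cache = {}                 # name -> (answer, expiry time)
        self.upstream_queries = 0

    def resolve(self, name, now=None):
        now = time.time() if now is None else now
        hit = self.cache.get(name)
        if hit and hit[1] > now:
            return hit[0]               # cache hit: roots untouched
        self.upstream_queries += 1      # cache miss: one upstream query
        answer = self.upstream(name)
        self.cache[name] = (answer, now + self.ttl)
        return answer

resolver = CachingResolver(ttl_seconds=3600,
                           upstream=lambda name: "192.0.2.1")
for _ in range(1000):                   # 1,000 lookups of the same name...
    resolver.resolve("example.com", now=0)
print(resolver.upstream_queries)        # ...cost only 1 upstream query
```

Longer TTLs trade freshness for resilience: during a root-server outage, cached answers keep resolving until they expire.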
“England Tests E-Voting”
Government Technology (11/02) Vol. 15, No. 14, P. 10; Peterson, Shane
Nine jurisdictions in the United Kingdom tested remote electronic voting systems in May, and the results were generally satisfactory, according to Thomas Barry of the Office of the Deputy Prime Minister. The technologies used included interactive voice response (IVR), short message service (SMS)-enabled mobile devices and PC-based systems, and PCs and kiosks installed in public areas. The Electoral Commission’s Tom Hawthorn acknowledges that, “Remote voting won’t be for everybody, and what we’re looking at doing is make sure there are a range of options available so people can pick the one that best fits the way they live.” The goal of the project was to demonstrate the feasibility of multichannel voting, rather than boosting voter turnout. To ensure security, a list of eligible voters in each jurisdiction was compiled and sent to a vendor that generated a unique, 10-digit PIN for each voter that could be used online or over the telephone, notes Alan Winchcombe of the Swindon Borough Council. Voters who used remote PCs viewed their ballots on a Web site that supplied an audit trail by enabling them to print their vote confirmation page, Hawthorn says. The Electoral Commission estimated in August that 76.5% of voters in the five local jurisdictions offering multichannel voting cast nearly 50,000 votes through mail-in ballots and at polling places; roughly 14.6% voted online, 6.1% used IVR, and 2.7% used SMS-enabled mobile devices. Barry says that the central government is considering a pilot of interactive digital TV for the next round, while respondents to a survey of electronic voters found the technologies easy to use, but were uncertain as to their security.