MIT wins DARPA's Great Red Balloon Hunt

Only nine hours after the balloons went aloft, the Defense Advanced Research Projects Agency (DARPA) announced that the MIT Red Balloon Challenge Team had won the $40,000 cash prize in the DARPA Network Challenge, a competition that required participants to locate 10 large, red balloons at undisclosed locations across the United States. The MIT team received the prize for being the first to identify the locations of all 10 balloons. DARPA offered up the rather interesting challenge last month: find and plot 10 red weather balloons scattered at undisclosed locations across the country. "The Challenge has captured the imagination of people around the world, is rich with scientific intrigue, and, we hope, is part of a growing 'renaissance of wonder' throughout the nation," DARPA director Dr. Regina E. Dugan said in a statement.

According to the agency, the balloons were in readily accessible locations, visible from nearby roadways and accompanied by DARPA representatives. The first person or team to identify the locations of all the balloons and enter them on the challenge Web site would win the $40,000 cash prize. All balloons were scheduled to go on display at 10:00 a.m. ET and remain up until approximately 4:00 p.m. on Saturday, December 5, 2009; had weather or technical difficulties delayed the launch, the display would have been pushed to Sunday, December 6 or later, depending on conditions. Latitudes and longitudes were entered in degree-minute-second (DDD-MM-SS) format, as explained on the website, and coordinates had to be entered with an error of less than one arc-minute to be accepted. If, for any reason, a balloon was displayed in one location and then moved to a second location, either location would be accepted. Entrants were required to register and submit entries on the event website.
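The coordinate rules above are concrete enough to sketch in code. The following is an illustrative example, not DARPA's actual validator: it converts decimal degrees to the degree-minute-second form the contest used, and checks a submission against the one-arc-minute tolerance.

```python
def decimal_to_dms(deg):
    """Convert a decimal-degree value to (degrees, minutes, seconds)."""
    sign = -1 if deg < 0 else 1
    deg = abs(deg)
    d = int(deg)
    m = int((deg - d) * 60)
    s = (deg - d - m / 60) * 3600
    return sign * d, m, round(s, 2)

def within_one_arcminute(submitted, actual):
    """True if two decimal-degree values differ by less than one arc-minute."""
    return abs(submitted - actual) < 1 / 60
```

For example, a latitude of 38.8977 degrees becomes roughly 38 degrees, 53 minutes, 51.72 seconds, and a submission within about 0.0167 decimal degrees of the true value passes the tolerance check.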

The DARPA Network Challenge was designed to mark the 40th anniversary of the Internet. "It is fitting for DARPA to announce this competition on the anniversary of the day that the first message was sent over the ARPANET, the precursor to the Internet," said Dr. Regina E. Dugan, who made the announcement at a conference celebrating the anniversary. "In the 40 years since this breakthrough, the Internet has become an integral part of society and the global economy. The DARPA Network Challenge explores the unprecedented ability of the Internet to bring people together to solve tough problems." The contest is the latest example of DARPA's interest in reaching nontraditional sources of ideas and talent. The agency's Grand Challenge competitions, started in 2004 to foster the development of autonomous robotic vehicle technology for use on the battlefield, enabled significant strides that may someday keep men and women in uniform out of harm's way. DARPA has held a number of such challenges, including one that featured robot cars and another that sought to spur development of lunar spacecraft.

Alcatel integrating network layers for efficiency

Alcatel-Lucent on Wednesday set a course for tighter integration of the two main components of long-haul service-provider networks, saying it will help carriers streamline their infrastructure and run it more efficiently. The company is a major player in carrier optical transport and is gaining ground on Cisco Systems and Juniper in IP (Internet Protocol) routing, according to industry analysts. Now, with the Converged Backbone Transformation Solution, it is leveraging its expertise in both technologies so the two can work more smoothly together and be managed more easily.

Most service-provider networks use electronic packet routers to direct Internet and private IP traffic, but they also rely on optical infrastructure to transport data over long distances. The two domains have remained largely separate, but Alcatel said it will bring its IP and optical systems closer together, with more flexible capacity handling and unified management. The payoff for enterprises that rely on carriers to interconnect their offices could be both faster provisioning and lower prices, said Ray Mota of Synergy Research Group. Today's IP and optical network elements effectively just hand off traffic to each other without much interaction, and they typically are managed by separate teams, said Lindsay Newell, vice president of marketing for IP at Alcatel. "If you go to an optical vendor, you get an optical answer," Newell said.

"If you go to a router vendor, you get a router answer," Newell said, but Alcatel says it is skilled in both. His company is best equipped to make these systems work more closely together because it has experience making both parts, Newell said. One thing Alcatel aims to provide is a more granular way of feeding traffic from IP routers into optical infrastructure. Current routers from most vendors can map one router port to one wavelength of light for optical transport; Alcatel is introducing that technology, called IP over dense wave-division multiplexing (DWDM), on its service routers now. Alcatel also plans to offer the ability to send traffic from multiple ports or from multiple virtual LANs into a single wavelength, Newell said.

But IP over DWDM alone isn't ideal, because it wastes optical capacity if there isn't enough traffic from the IP port to fill the wavelength, Newell said. Carriers can use the more granular grooming to make more efficient use of each wavelength, so potentially they won't have to deploy or light up as many wavelengths, he said. This could save space and power in carrier facilities, as well as money. Also through closer integration, Alcatel will allow IP routers to send traffic straight across the optical network, bypassing unnecessary IP routing along the way. The company will implement the capabilities using existing and emerging industry standards, adding some proprietary features of its own but keeping its products interoperable with gear from other vendors at a more basic level.

This core-router bypass capability will let traffic destined from, say, Los Angeles to New York go straight to its destination without going through an IP router in Chicago, Newell said. At a higher level, Alcatel said it can integrate the management of both network layers because it supplies both. Among other things, the IP and optical management systems will know what resources are available on each and be able to communicate fault-management alarms. The Converged Backbone Transformation Solution is a set of features that will roll out over time. Immediately, Alcatel is delivering features including IP over DWDM on service routers and the initial elements of information exchange between IP and optical, such as common alarm views and fault isolation. Ultimately, the IP network elements will be able to reroute traffic if there's a failure in the optical layer, and vice versa.

Next year, the company plans to provide static provisioning for port-level and VLAN traffic grooming. Later it will offer more dynamic interaction between the layers, including dynamic provisioning for failover, Newell said. The integration ultimately can save carriers at least 30 percent in capital expenditures on a network built from the ground up with the new technology, according to Newell. Savings for carrier networks with a large amount of existing infrastructure will be more incremental, he said. Many carriers are grappling with data traffic that is growing far faster than the revenue they can collect for it, and this type of streamlining could help them, Synergy's Mota said.

Med students' tweets, posts expose patient info

Future doctors are too frequently putting inappropriate postings, and sometimes confidential patient information, on social sites like Facebook and Twitter, according to a study published in the Journal of the American Medical Association. The study shows that in a survey of medical colleges, 60% reported incidents of medical students posting unprofessional content online. Thirteen percent reported that students had violated patient confidentiality in postings on social networking sites.

The study, published this week, surveyed deans or their counterparts at 78 U.S. medical colleges. It also showed that 39% of colleges found medical students posting pictures of themselves intoxicated, and 38% reported medical students posting sexually suggestive material. Of the schools that reported finding inappropriate student content, 67% said they gave only informal warnings and 7% said they expelled the student. People are frequently warned that photos, posts, and even comments from friends and family on sites like Facebook, MySpace and Twitter could come back to haunt them. Dan Olds, an analyst with The Gabriel Consulting Group, said posting inappropriate material, such as pictures of oneself drunk, has long been a downside of social networking.

Companies report that they check social networking sites before hiring a prospective employee, and an off-hand comment about a work project or annoying colleague can easily come back to bite someone in the office. However, when health care workers are involved in such activity, it takes on a new dimension. "Doctors are in a bit of a unique position in society - almost universally trusted by patients to hold some of their most personal information confidential," Olds said. "This relationship needs to exist, because if patients hold back information from their doctor, it can have a serious impact on their lives. And it's hard to believe that medical students, folks who are highly educated, are so stupid as to not see the downside of these social networking activities." He added that aside from posting patient information online, it's also a bad idea for medical students to post pictures of the drunken party they were at the night before or information about their latest tryst. "Even though this was probably done innocently and with no bad intent, the potential for damage to patients is large," Olds said. "Seeing their doctors partying and drunk is not the way to engender trust, particularly if you're the person who has an appointment with that doctor the next day." If patients believe their doctors are unintentionally, or, worse yet, intentionally, revealing confidential information, then that trust will be irreparably damaged.

Ellison: Fusion Applications in 2010

Oracle plans to launch its long-awaited Fusion Applications in 2010, and they will be deployable both on-premises and as SaaS (software as a service), CEO Larry Ellison said Wednesday during a keynote address at the OpenWorld conference in San Francisco. Fusion Applications, which Oracle first announced several years ago, will combine the best elements of Oracle's various business software product lines into a next-generation suite. Oracle has placed special emphasis on improving the user experience with Fusion, as well as on embedding BI (business intelligence) throughout the applications, Ellison said.

Ellison's keynote contained the most specific information the company has provided about Fusion Applications since first announcing the project several years ago. The CEO took pains to tell the packed room of Siebel, JD Edwards and E-Business Suite users that Oracle has no plans to abandon those product lines anytime soon. "Oracle will continue to enhance those applications for the next decade and beyond. We're absolutely committed to do that," he said to applause. "We can afford to not only maintain the software you're running today, but also build the software you may want to move to tomorrow." Ellison did not provide details regarding licensing and pricing models, including whether Oracle will sell the new applications via subscription, as is the norm with SaaS. But Oracle is nonetheless ensuring the products are ready for SaaS, including by developing monitoring tools that will track their performance, Ellison said. While SaaS vendors provide users with service-level agreement guarantees, "there aren't very good tools for figuring out whether you're actually getting the service levels you're paying for," he said. Oracle's tools will enable it to "not only contractually commit but prove we're delivering the service levels." Fusion Applications are based on a SOA (service-oriented architecture) provided by Oracle's Fusion Middleware stack, Ellison said. This gives Oracle "a huge advantage" because the SOA model will allow users to easily tie together "the Fusion generation and all the stuff you have deployed today," Ellison said. "We don't think all customers are going to replace what they have today with Fusion," he added. "We think they will augment what they have with some Fusion.

Fusion is designed to be delivered that way. ... We have replacement applications and then we have net-new applications." The initial suite will include modules for financial management, human capital management, sales and marketing, supply chain management, project management, procurement management and GRC (governance, risk and compliance), but other key areas, such as manufacturing, will come later. Oracle has worked "very, very closely" with customers to design and test Fusion Applications, work that has resulted in a superior user interface, Ellison said. Ellison stressed the benefits of the modular approach. "You assemble the components in the order you want to use them, in the order that makes sense for your industry," he said. Embedded BI is another major focus of the suite. "You can't use the system without using business intelligence," Ellison said. In a demonstration, a pair of Oracle executives showed how the system alerted one user that a particular shipment had been delayed. The application allowed the user to bring up a dashboard showing which order manager was responsible for the particular transaction, and then begin an instant-messaging conversation with him directly from the tool.

In turn, the order manager was able to search for less critical orders and reroute them to fulfill the first one. "We tell you what you need to know, what you need to do, and we tell you how to do it," Ellison said.

Ellison's presentation proved that "Fusion apps are real," said Ray Wang, a partner with the analyst firm Altimeter Group. SaaS applications are different from straight application hosting, because they use a "multitenant" architecture wherein customers share a single instance of an application but their data is kept private from other customers. In a presentation Tuesday, on-demand CRM (customer relationship management) vendor and Oracle rival Salesforce.com compared multitenancy to an office building, where individual tenants share the overall infrastructure but customize their office spaces. Oracle "will definitely" offer a hosted version of Fusion Applications, although it remains to be seen exactly how its SaaS strategy for the software plays out, Wang said. While Oracle "definitely has the capability to deliver this as SaaS, it's really up to them to figure out if they want to enter [that market] large-scale," he added. In some product areas, such as talent management, "they can't compete without the SaaS option," he said. "They're playing catch-up."

When Fusion Applications arrive, they will also raise the competitive stakes between Oracle and its main rival, SAP. But SAP spokesman Saswato Das dismissed Oracle's announcement. "Basically, our Business Suite 7 is the most comprehensive and flexible suite of applications on the market," Das said. "Oracle has been talking about Fusion for a long time, and our suite is available now."

Meanwhile, the work ahead of companies looking to adopt Fusion Applications sooner rather than later is "not trivial," said Floyd Teter, head of the Oracle Applications Users Group's Fusion Council, which has been educating group members about the upcoming applications release. One key step customers should take is to catalogue their application customizations and determine which ones could be retired, Teter said. "A lot of us have done a lot of custom things. If you're a long-term Oracle customer, it's easy to lose track." Fusion Applications will also require some companies to acquire new development skills, Teter said. "A lot of us run a lot of customizations through mod_plsql. That's going to be gone. The skill set now is more Java and specifically [Java Enterprise Edition]. You also better have some knowledge of JavaScript." In addition, Fusion Applications rely on Oracle's JDeveloper IDE (integrated development environment), rather than other Java development tools like Eclipse.

For many companies, there will be plenty of time to plan, since the first version of Fusion Applications won't include certain functional areas. The lack of manufacturing has prompted the Jet Propulsion Laboratory at the California Institute of Technology, which uses E-Business Suite, to wait for a future version, said Teter, who is a project manager at the lab. "When I get a full-functionality replacement, we'll look at it. In the meantime, we'll continue to stay current on EBS." But Teter said the vendor's work on Fusion has produced impressive results, particularly in regard to user experience.

Earlier in his keynote, Ellison turned to Oracle's recently announced Exadata 2 appliance for data warehousing and transaction processing. Exadata 2 uses Sun hardware, while the original machine, announced at last year's OpenWorld show, used Hewlett-Packard iron.

He claimed the machine widely outperforms and is much less expensive than competing technologies, such as those from IBM, calling it "the fastest computer that has ever been built to run data warehousing applications." "This system will outperform any of the competition," he said. Oracle is in the process of buying Sun Microsystems, but the deal is on hold while European officials conduct an antitrust review. Ellison didn't discuss the acquisition during his keynote, but Sun and its officials have played an active role in this year's OpenWorld conference. Ellison temporarily ceded the stage to California Gov. Arnold Schwarzenegger, who delivered a joke-peppered talk espousing the value of technology, from biotech to the Hollywood special effects that powered his long career as an action star. "Think of Conan the Barbarian fighting the giant snake," he said, referring to his role in the 1982 film based on Robert E. Howard's tales of a legendary warrior king. "I never could have done that and look so studly without technology," he said to an eruption of laughter from the crowd.

Schwarzenegger also congratulated Ellison and Sun chairman Scott McNealy on the pending acquisition, stressing the companies' importance to California's economy. "Working together, I know the sky is the limit for you and your employees," he said.

Amazon takes Kindle global, lowers price

Amazon plans to start selling its Kindle reader in over 100 countries and territories on Oct. 19, and the company has already started booking pre-orders for the device on its Web site. The Kindle was previously available only in the U.S. Amazon is selling the Kindle with U.S. & International Wireless to customers in Asia, Africa, Europe, Australia and South America for US$279; the reader has a 6-inch display and the ability to wirelessly download books and other content globally, the company said on Wednesday. Amazon has been working with publishers for many months to build a vast selection of English-language books available around the world, said Stephanie Mantello Ward, a spokeswoman for Amazon, in an e-mail on Wednesday. Amazon also has some books in languages other than English, but its focus right now is to provide its customers with the best possible experience for English-language content, she added.

The Kindle will not be sold in China because Amazon is unable to ship the Kindle or offer Kindle content to Chinese customers, the company said. Ward did not give a reason why China was not included in the current launch. "We want to ship Kindle everywhere, and we're working on it," she said. Best-seller books will cost $11.99 or more for international customers, with about 100,000 other titles available for less than $5.99, Amazon said. These prices are higher than in the U.S., where most best sellers cost $9.99. Because Amazon is selling from the U.S. store, the prices of the books in international markets are denominated in U.S. dollars rather than local currencies. "We aren't announcing a timeline today for payments in other currencies," Ward said. Amazon also said that it is lowering the price of the 6-inch-display Kindle in the U.S. from $299 to $259; the Kindle DX, which has a 9.7-inch display, retails for $489. The Kindle with U.S. & International Wireless is 0.36 inches thick and weighs just over 10 ounces, Amazon said. Its 2GB of memory holds up to 1,500 books.

Whispernet is a wireless delivery system for the Kindle that allows a user to download books and other Kindle content; AT&T will be offering the service to international users of the Kindle. Customers who buy a Kindle in a country without Whispernet coverage will be able to purchase content from the Kindle Store through a PC and download it to their Kindle through a USB (universal serial bus) cable, Ward said. The device also features an experimental text-to-speech feature, Amazon added.

Flash flaw puts most sites, users at risk, say researchers

Hackers can exploit a flaw in Adobe's Flash to compromise nearly every Web site that allows users to upload content, including Google's Gmail, then launch silent attacks on visitors to those sites, security researchers said today. "The magnitude of this is huge," said Mike Murray, the chief information security officer at Orlando, Fla.-based Foreground Security. "Any site that allows user-uploadable content is vulnerable, and most are not configured to prevent this." The problem lies in the Flash ActionScript same-origin policy, which is designed to limit a Flash object's access to other content only from the domain it originated from, added Mike Bailey, a senior security researcher at Foreground. Unfortunately, said Bailey, if an attacker can deposit a malicious Flash object on a Web site - through its user-generated content capabilities, which typically allow people to upload files to the site or service - they can execute malicious scripts in the context of that domain. "This is a frighteningly bad thing," Bailey said. "How many Web sites allow users to upload files of some sort? How many of those sites serve files back to users from the same domain as the rest of the application?

Nearly every one of them is vulnerable." Bailey, who demonstrated how attackers could compromise a Web site and attack users in a post today on Foreground's blog, outlined how a hacker would leverage the Flash flaw. "It's relatively simple," he maintained. "All they need to do is create a malicious Flash object, and upload it to the [Web] server." He used the example of a company that lets users upload content to a message forum to explain the process. "If the user forum lets people upload an image for their avatar, someone could upload a malicious Flash file that looks like an avatar image," Bailey said. "Anyone who then views that avatar would be vulnerable to attack." Adobe has told Foreground that the flaw is "unpatchable," Murray and Bailey said. Instead, Adobe is trying to educate site administrators to close the hole on their end. But they've not had much success. "Some of the big Web properties have figured this out," said Bailey. "In a lot of cases, they're hosting user-generated content on another domain, perhaps for performance reasons." Among the sites and services that have locked down their servers, Foreground cited Microsoft's Windows Live Hotmail and Google's YouTube. "But very few system administrators are even aware of this," Bailey added. Even some of Adobe's own Web properties are vulnerable to such an attack. "How can Adobe expect others to protect themselves when they can't do it themselves?" asked Murray. The only current defense users can employ against such attacks is to stop using Flash or, failing that, restrict its use to sites known to be safe with tools such as the NoScript add-on for Mozilla's Firefox or ToggleFlash for Microsoft's Internet Explorer. "The best mitigation is to not use Flash," argued Murray, "but we know that that's impossible for most users, since Flash is so widely used on the Web." "Almost everyone using the Internet is vulnerable to a Web site that allows content to be updated inappropriately," said Murray. "That's not hyperbole, it's just fact."

Google's Gmail is also at risk from a malicious Flash attack - Gmail lets users upload and download file attachments - although Bailey said that exploiting Google's Web mail service would be "extremely tricky," with "lots of hoops to jump through." Although Foreground has not detected any in-the-wild attacks using the technique, Murray said that there's evidence hackers are moving toward such tactics. "We're starting to see Flash used in these ways," he said, citing a recent worm that leveraged a similar vulnerability in Adobe's software, which is pervasive on the Web and on users' machines. "The worst-case scenario is that someone would figure this out, and launch silent attacks against the entire Internet. This has the potential to affect any social media site, any career site, any dating site, many retail sites and many cloud applications. End users would never know they got exploited. That's why this attack is so serious." That fear was a major consideration in Foreground's decision to go public with its information, even though Adobe can't fix the problem with a global patch of some sort. "We went back and forth on this a whole lot," said Murray. Adobe was not immediately available for comment.
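The defense Bailey credits the big Web properties with - serving user-generated content from a separate domain so an uploaded Flash object never executes in the main site's origin - can be sketched roughly as follows. The host name, helper names, and exact header set here are illustrative assumptions, not Foreground's or Adobe's prescribed fix:

```python
# Rough sketch of one server-side mitigation for the upload problem:
# serve user-uploaded files from a sandbox domain, and force them to be
# downloaded as opaque bytes rather than rendered (and possibly executed,
# e.g. as Flash) in the main application's origin.

UPLOAD_HOST = "usercontent.example-sandbox.net"  # hypothetical separate domain

def upload_response_headers(filename):
    """Headers for serving an untrusted upload as a plain download."""
    return {
        # Force a download instead of in-browser rendering/execution.
        "Content-Disposition": 'attachment; filename="%s"' % filename,
        # Serve as an opaque byte stream, and ask browsers not to sniff
        # the bytes into a richer (potentially executable) type.
        "Content-Type": "application/octet-stream",
        "X-Content-Type-Options": "nosniff",
    }

def upload_url(file_id):
    """Uploads are linked from the sandbox host, never the app's own domain."""
    return "http://%s/files/%s" % (UPLOAD_HOST, file_id)
```

Because the sandbox domain shares no cookies or scripting origin with the main site, a malicious upload that does slip through can only attack the throwaway domain, not the application itself.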

Mozilla augments Firefox's plug-in check

As it promised, Mozilla has created a page that checks for outdated plug-ins used by Firefox. Mozilla is testing the page, which pings the company's servers and then returns a list of plug-ins, noting those that are up to date and those that should be updated. Links to the plug-in download pages are also available so that users can obtain the most current versions of software from the likes of Adobe, Microsoft, Sun and Apple. A month ago, Mozilla debuted its plug-in checking with updates to Firefox 3.5.3 and Firefox 3.0.14, which automatically detected outdated versions of Adobe's Flash Player and prompted users to upgrade to the newest - and theoretically the most secure - edition of the browser plug-in.

At the time, Mozilla said it would follow that initial move with others, including the publishing of a page that Firefox users could visit to check the status of other vendors' plug-ins. Two weeks after introducing the plug-in check, Mozilla said it had convinced 10 million Firefox users to go to Adobe's site and grab the latest version of Flash. According to Blair McBride, the Mozilla developer who announced the test plug-in check page on his blog today, Mozilla will add a built-in plug-in checking feature to Firefox 3.6, which is slated for release before the end of the year. "Firefox 3.6 will have this integrated to make sure users know when they have an outdated plug-in, without having to manually visit the Plugin Check page," said McBride. "Whenever you load a page that uses a plug-in that is out of date, you'll get [a] warning." The warning will read: "Some plugins used by this page are out of date," and a button will be available to update the plug-in. Firefox 3.6 will also sport changes to the browser's add-on manager. "The Plugins tab of the Extension Manager (Tools/Add-ons) will indicate which of your plugins are out of date," promised McBride. Security experts have commended Mozilla on the moves.

The changes are to appear in the first, and likely only, beta of Firefox 3.6; that preview is slated to ship next week under Mozilla's published schedule. Last month, Wolfgang Kandek, the chief technology officer at security firm Qualys, called the plug-in check "a great way to improve the security of Web browsers," noting that vulnerabilities in Adobe's Flash are frequently targeted by hackers. Mozilla has already caught one bug in the plug-in test page. Earlier Tuesday, Mac users running Snow Leopard were told that QuickTime was out of date, even though they had installed Mac OS X 10.6.1, Apple's sole security update so far for the new OS. The problem, according to a post on Bugzilla, Mozilla's bug- and change-tracking database, was that Snow Leopard dubbed its version of QuickTime as 7.6.3, while its predecessor Leopard tagged it as 7.6.4. Later Tuesday, Mozilla implemented a fix to the plug-in checking page to accurately report QuickTime 7.6.3 as the most up-to-date edition for Snow Leopard users.
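The QuickTime mix-up illustrates why a plug-in checker cannot simply compare against a single "latest" number. A minimal sketch of the needed logic, with a hypothetical per-platform table (the data and function names are illustrative, not Mozilla's actual implementation):

```python
# Version comparison for a plug-in check. Snow Leopard's current QuickTime
# was 7.6.3 while Leopard's was 7.6.4, so "latest" must be looked up per
# platform; comparing raw strings or one global number misfires.

def parse_version(v):
    """Turn a dotted version string into a tuple of ints: '7.6.3' -> (7, 6, 3)."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical table of the newest known version per (plugin, platform).
LATEST = {
    ("QuickTime", "Snow Leopard"): "7.6.3",
    ("QuickTime", "Leopard"): "7.6.4",
}

def is_outdated(plugin, platform, installed):
    """True if the installed version is older than the platform's latest."""
    latest = LATEST[(plugin, platform)]
    return parse_version(installed) < parse_version(latest)
```

Tuple comparison also sidesteps the classic string-comparison trap where "7.10" sorts before "7.9".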

Microsoft greasing Windows 7 skids with early release of desktop tools

With the hope of sparking Windows 7 upgrades, Microsoft is planning an early release of its suite of desktop deployment tools.  The tools were originally slated to ship in early 2010, but Microsoft hopes to give customers the software in late October for use in rollouts of Windows 7 across corporate desktops. The news of the early release was announced by Ran Oelgiesser, senior product manager for MED-V, on the MDOP blog. The catch is that the Microsoft Desktop Optimization Pack (MDOP) R2 2009 is only available to volume licensing customers with Software Assurance contracts.

All the tools in MDOP R2 2009 will include support for Windows 7 except MED-V; support for the new OS in MED-V 1.0 SP1 will come early in 2010, wrote Oelgiesser. Windows 7 is slated to ship to commercial customers on Oct. 22, but corporate users with volume licensing contracts have had access to Windows 7 since last month. MED-V runs multiple versions of Windows or applications concurrently without having to open multiple virtual-machine sessions. The software complements another MDOP tool called App-V, which is used for managing and deploying virtual PCs. The suite is a major part of Microsoft's Optimized Desktop strategy, which addresses centralized management and deployment of physical and virtual resources. The MDOP lineup also includes Asset Inventory Service; System Center Desktop Error Monitoring; Advanced Group Policy Management (AGPM) for change management via group policy objects; and the Diagnostics and Recovery Toolset, which helps in recovering a crashed PC. MDOP is composed of software from Microsoft's purchases of Softricity, Kidaro, AssetMetrix, Winternals Software and DesktopStandard.

According to Oelgiesser, App-V 4.5 SP1 will have various integration points with 32-bit versions of Windows 7, including the AppLocker, BranchCache and BitLocker To Go features. The 64-bit version, App-V 4.6, will be available in the first half of 2010. Advanced Group Policy Management 4.0 features two new capabilities targeted at Windows 7: one allows users to manage group policies across different domains, and the other provides new search and filtering to ease tracking of group policy objects. In addition, the software will support 32-bit versions of XP, Vista and Windows Server.

US company burned by China Web filter plans rival product

A U.S. company whose software code was allegedly stolen in China by a controversial, government-backed Internet filtering program will hit back by launching a rival product at a low price in China, the company said late Sunday. Solid Oak Software, which has said its code was copied in a program that China ordered be bundled with all new PCs, is exploring ways to offer its own Web filter for free or at a very low price in China, company president Brian Milburn said in an e-mail. The Solid Oak program, called CyberSitter and targeted at parents, will be offered in languages including Chinese in a version due out next month.

A Chinese version of the product would compete with Green Dam Youth Escort, the program that Solid Oak says copied its code and that China originally ordered PC makers to include with all new computers sold in the country from July this year. But under heavy pressure from foreign PC makers and the U.S. government, China indefinitely postponed the mandate just hours before it was set to take effect. The Chinese government had paid the program's developers to allow all PC buyers to use the software for free for one year, and major PC makers including Lenovo and Acer bundled Green Dam with new PCs until this month. The program also used blacklists apparently lifted from Solid Oak's software, according to the company and a group of U.S. researchers.

The program, which China said was meant to protect children from online pornography, was also found to block politically sensitive material such as negative references to a former Chinese president. Green Dam came under fire for concerns about system stability in addition to user privacy and freedom of speech. One Beijing high school recently removed the program from its computers after finding that it conflicted with software used for grading and attendance tracking. One file found in the Chinese program contained an encrypted version of a years-old Solid Oak news bulletin, according to the researchers. Solid Oak, which is based in Santa Barbara, California, is preparing legal action against PC makers that shipped Green Dam, though an update to the program in June removed some of the allegedly infringing elements.

Bryan Zhang, general manager of Jinhui Computer System Engineering, one of the designers of the Chinese software, declined to comment on the allegations of code theft. Green Dam "is a conglomeration of whatever components [the developers] managed to steal ... or otherwise appropriate from various sources, and duct tape together in the form of an alleged piece of software," Milburn wrote in his e-mail. "They should be utterly humiliated, not just because they stole much of the core functionality, but even more so because they intentionally inflicted such a miserable product on a population of innocent computer users," Milburn wrote. The new Solid Oak product will have a Chinese user interface and a filtering function that the company reworked after much of its old proprietary code appeared online. The filtering will be entirely URL-based, avoiding the need to translate keywords into Chinese. "We are working on a way to release it for free. That is the ultimate goal," company spokeswoman Jenna DiPasquale said in an e-mail.
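The URL-based approach described here can be illustrated with a minimal sketch. This is not Solid Oak's actual implementation, and the domain names and function are hypothetical; it simply shows why matching hostnames against a blocklist sidesteps keyword translation entirely:

```python
from urllib.parse import urlparse

# Hypothetical blocklist; a real product ships a much larger, curated list.
BLOCKED_DOMAINS = {"badsite.example", "adult.example"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or any parent domain, is on the blocklist."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check the host and each parent domain, so sub.badsite.example is caught too.
    return any(".".join(parts[i:]) in BLOCKED_DOMAINS for i in range(len(parts)))

print(is_blocked("http://sub.badsite.example/page"))  # True
print(is_blocked("http://news.example.com/story"))    # False
```

Because the decision is made purely on the requested address, the same blocklist works no matter what language a page's content is written in.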

Perot wins key health-care IT outsourcing deal in India

Perot Systems has bagged a 10-year IT outsourcing contract in India, its first outside the U.S. The win reflects Perot's bid to grow its health-care business in markets other than the U.S., as well as in emerging markets like India, China, Brazil, and Mexico, company executives said on Friday. In the second quarter, 48 percent of Perot's revenue came from the health-care industry. But only 4.1 percent of the company's health-care revenue was from outside the U.S., up from 2.5 percent two years ago, said Kevin Fickenscher, executive vice president for International Healthcare at Perot, in a telephone interview.

Expansion outside the U.S. is a key focus area for Perot, said Raj Asava, Perot's chief strategy officer. For its health-care business, the company is targeting emerging markets in the Middle East, China, India, and Latin America, in addition to more mature markets such as the U.K. and Germany. The maturing health-care industry in these emerging markets has a big appetite, and the funds, to invest in technologies such as electronic health records and clinical information systems, Asava said. The contract with Max Healthcare, a large hospital chain in India, has an initial value of US$18 million, but could go up in value as more applications and services are added, Perot said. The deployment will be built around the open source VistA (Veterans Health Information Systems and Technology Architecture) electronic health record and health information system, he added.

Besides running the applications already installed at Max, Perot will also deploy an electronic health records system and other IT infrastructure, Fickenscher said. Perot already has a services subsidiary in India with about 9,000 staff that offer outsourcing services to customers in the U.S., Europe, and other parts of the world. About 60 percent of these staff do work for the health-care industry. Multinational and Indian service providers are targeting India's growing services market, including in the telecommunications sector, where a number of mobile service providers are outsourcing their IT infrastructure. The immediate opportunity for vendors of IT targeting the health-care industry is from private-sector providers, but government-run hospitals will soon follow, Fickenscher said.

Fugitive hacker headed back to U.S. for arraignment

A Miami man who for three years had evaded prosecution in connection with the theft and reselling of VoIP services is being extradited to Newark from Mexico today and is set to be arraigned in a New Jersey federal courthouse on Friday. Edwin Pena, 26, had been arrested in June 2006 on multiple computer and wire fraud charges. He had been free on $100,000 bail, and then allegedly fled the country about two months later.

Pena was apprehended in Mexico in February, and federal prosecutors have been working since then to get him extradited back to the U.S., according to Assistant U.S. Attorney Erez Liebermann. "He's been a fugitive for over three years," said Liebermann, who is prosecuting the case. "We're looking forward to proceeding with the prosecution." Pena faces 20 charges, including conspiracy to commit computer intrusion and conspiracy to commit wire fraud. According to a criminal complaint filed in U.S. District Court in New Jersey, Pena and co-conspirator Robert Moore of Spokane, Wash., sold more than 10 million minutes of VoIP service that had been stolen from 15 telecommunications providers. The U.S. alleges that from November 2004 to May 2006, Pena and Moore hacked into the computer networks of VoIP service providers and routed calls made by customers of Pena's VoIP service through them. Prosecutors have contended that the lost minutes were valued at $1.4 million to the providers victimized in the alleged scam. Federal investigators contend that Pena was the mastermind behind the scheme and that Moore hacked the systems. In the fall of 2007, Moore pleaded guilty to conspiracy to commit computer fraud and began a two-year prison sentence.

Voice-over-IP systems route telephone calls over the Internet or other IP-based networks. Moore scanned telecommunications company networks around the world, searching for unsecured ports; the criminal complaint said that between June 2005 and October 2005, Moore ran more than 6 million scans of network ports within the AT&T network alone. The complaint alleges that once Moore found unsecured networks, he would e-mail Pena the key information needed to access them. Once the networks were accessed, prosecutors allege that Pena ran brute-force attacks to find the proprietary codes needed to identify and accept authorized calls coming into the networks. He allegedly used the codes to surreptitiously route his clients' calls through the systems. According to court documents, Pena gained more than $1 million from the scheme.
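The scanning step described in the complaint is conceptually simple: a TCP connect scan attempts a connection to each port and records which ones accept. The sketch below is a generic illustration of that technique, not a reconstruction of the tools actually used in the case, and should only ever be run against machines you own or have permission to test:

```python
import socket

def scan_ports(host: str, ports, timeout: float = 0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        try:
            # create_connection raises OSError (e.g. ConnectionRefusedError)
            # when the port is closed, filtered, or unreachable.
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)
        except OSError:
            pass
    return open_ports

# Usage: check a few common service ports (SSH, HTTP, SIP) on the local machine.
print(scan_ports("127.0.0.1", [22, 80, 5060]))
```

Run at the scale described in the complaint, millions of such probes amount to a systematic sweep for services left exposed to the Internet.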

Some of that money was allegedly spent to buy real estate in Miami, a 40-foot boat and luxury cars, including a BMW M3 and a Cadillac Escalade.

Meet Nook, Barnes & Noble's e-book reader

On Tuesday, Barnes & Noble announced that the Nook, the company's e-book reader that aims to compete with Amazon's Kindle, is available for pre-order. It's a very interesting device: the first dedicated e-book reader powered by Google's Android operating system (it runs Android 1.5). The Nook should ship at the end of November, and it'll cost you $259. That's the same price as the Kindle 2, though an international Kindle 2 that allows wireless access outside of the U.S. costs $279. (The Nook doesn't include an international option at the moment.) Barnes & Noble's reader has a 6-inch diagonal E Ink display, just like the Kindle 2, but the clever folks at B&N have also added a 3.5-inch color LCD screen below the E Ink screen. That ancillary screen is used to navigate books via a Cover Flow-like interface, display an on-screen keyboard, and generally operate the device. The Nook comes with 2GB of internal memory, which Barnes & Noble says will hold about 1,500 e-books, though that can be expanded by using the included microSD slot.

The Nook, again much like the Kindle, comes bundled with wireless 3G access - via AT&T, while the Kindle uses Sprint's network - so you can download content wirelessly. The Nook ups the wireless ante by also including Wi-Fi connectivity (802.11b/g) and access to free Wi-Fi in all of Barnes & Noble's stores (which is a very good idea, though it doesn't appear that the Nook has a Web browser, as the Kindle does). Free samples of all titles will be available, and users will be able to access special content when using their Nook at a Barnes & Noble store. An e-reader isn't of much interest without something to read on it, and Barnes & Noble boasts more than a million titles, though many of those are through a partnership with Google to distribute public-domain titles; there are newspapers and magazines available as well. You can also read your own PDFs on the Nook, something you can't do with a Kindle 2 without converting the PDF first. And should you wish, you can remove the Nook's battery, for fun and profit - and B&N will sell you an extra battery if the 10-day charge (without using wireless) isn't enough for you. You can even listen to MP3s on the Nook, either through the built-in mono speaker or by plugging in headphones.

One of the biggest differences between the Nook and Amazon's Kindle is that you can let your friends borrow a Nook book for up to 14 days. They will be able to read it on their Nook, or using the Barnes & Noble e-reader available for PCs, Macs, the iPhone, some Motorola smartphones, and the BlackBerry. You can also start reading a book on your Nook, and then keep reading where you left off on your Mac or PC thanks to Barnes & Noble's Reading Now technology, which sounds very much like Amazon's WhisperSync feature. If you want to play around with a Nook in person, you'll be able to do so at any of Barnes & Noble's physical stores, thanks to special Nook displays that should be popping up in the coming weeks.

The other iPhone lie: VPN policy support

It turns out that Apple's iPhone 3.1 OS fix of a serious security issue - falsely reporting to Exchange servers that pre-3G S iPhones and iPod Touches had on-device encryption - wasn't the first such policy falsehood that Apple has quietly fixed in an OS upgrade. It fixed a similar lie in its June iPhone OS 3.0 update. Before that update, the iPhone falsely reported its adherence to VPN policies, specifically those that confirm the device is not saving the VPN password (so users are forced to enter it manually). Until the iPhone 3.0 OS update, users could save VPN passwords on their Apple devices, yet the iPhone OS would report to the VPN server that the passwords were not being saved.

The fact of the iPhones' false reporting of their adherence to Exchange and VPN policies has caused some organizations to revoke or suspend plans for iPhone support, several readers who did not want their names or agencies identified told InfoWorld. Last week's iPhone OS 3.1 update began correctly reporting the on-device encryption and VPN password-saving status when queried by Exchange and VPN policy servers, which made thousands of iPhones noncompliant with those policies and thus blocked from their networks. (Only the new iPhone 3G S has on-device encryption.) Apple's document on the iPhone OS 3.1 update's security changes neglected to mention this fix, catching users and IT administrators off-guard. Worse, it revealed that Apple's iconic devices have been unknowingly violating such policies for more than a year. [ Apple's snafu on the iPhone OS's policy adherence could kill the iPhone's chances of ever being trusted again by IT, argues InfoWorld's Galen Gruman. ] One reader at a large government agency describes the IT leader there as "being bitten by the change," after taking a risk to support the popular devices. "I guess we will all have to start distrusting Apple," said another reader at a different agency.

"My guess is the original decision to emulate hardware encryption was made at a level where there wasn't much awareness of enterprise IT standards. After all, this is a foreign language for Apple," says Ezra Gottheil, an analyst at Technology Business Research. "However, once the company realized the problem, it made a spectacularly dumb choice. Instead, it allowed itself to be seen in the worst possible light. The change was necessary and inevitable, but Apple could have earned some points by coming clean at the earliest opportunity. This is the result of a colossal clash of cultures. Even when it is trying, Apple cannot force itself to think like an enterprise vendor."

Apple's advice to users on addressing the Exchange encryption policy issue is to either remove that policy requirement for iPhone users or replace users' devices with the iPhone 3G S. IT organizations can also consider using third-party mobile management tools that enforce security and compliance policies; several now support the iPhone to varying degrees, including those from Good Technology, MobileIron, and Zenprise.

CIA endorses cloud computing, but only internally

WASHINGTON - One of the U.S. government's strongest advocates of cloud computing is also one of its most secretive operations: the Central Intelligence Agency. But the CIA has adopted cloud computing in a big way, and the agency believes that the cloud approach makes IT environments more flexible and secure. Jill Tummler Singer, the CIA's deputy CIO, says that she sees enormous benefits to a cloud approach.

And while the CIA has been moving steadily to build a cloud-friendly infrastructure - it has adopted virtualization, among other things - cloud computing is still a relatively new idea among federal agencies. "Cloud computing as a term really didn't hit our vocabulary until a year ago," said Singer. But now that the CIA is building an internal cloud, Singer sees numerous benefits. For example, a cloud approach could bolster security, in part, because it entails the use of a standards-based environment that reduces complexity and allows faster deployment of patches. "By keeping the cloud inside your firewalls, you can focus your strongest intrusion-detection and -prevention sensors on your perimeter, thus gaining significant advantage over the most common attack vector, the Internet," said Singer. Moreover, everything in a cloud environment is built on common approaches. That includes security, meaning there's a "consistent approach to assuring the identity, the access and the audit of individuals and systems," said Singer. The CIA uses mostly Web-based applications and thin clients, reducing the need to administer and secure individual workstations. And it has virtualized storage, protecting itself "against a physical intruder that might be intent on taking your server or your equipment out of the data center," said Singer. But there are limits. The agency isn't using a Google model and "striping" data across all its servers; instead, data is kept in private enclaves protected by encryption, security and audits.

Speaking at Sys-Con Media's GovIT Expo conference today, Singer not only provided a rare glimpse into the IT approaches used by the agency, but also talked about one of its greatest challenges: the cultural change cloud environments bring to IT. A move to cloud environments "does engender and produce very real human fear that 'I'm going to lose my job,'" she said.

In practice, highly virtualized environments reduce the need for hardware administration and, consequently, for system administrators. Barry Lynn, the chairman and CEO of cloud computing provider 3tera Inc. in Aliso Viejo, Calif., said a typical environment may have one systems administrator for every 75 physical servers. In contrast, a cloud-based environment may have just one administrator for every 500 servers or more. The CIA has "seen a significant amount of pushback, slow-rolling [and] big-process engineering efforts to try to build another human-intensive process on top of enterprise cloud computing," said Singer. "It will take us a good long while to break that." One thing the agency will do to address resistance will be to base contract competitions on performance, not head count, "where it's to [a service provider's] benefit to do the work with fewer bodies and make more profit for their company," said Singer.

Federal CIO Vivek Kundra is encouraging agencies to adopt cloud computing, and he recently opened an online apps store that enables federal agencies to buy cloud-based services from Google, Salesforce.com and other vendors. That's something the CIA will not do; its data will remain within the agency's firewalls, said Singer. Government market research firm Input has revised its forecast for federal cloud-related spending upward; it now expects the government's cloud expenditures to grow from $363 million this year to $1.2 billion by 2014. "I think this is probably a conservative estimate, considering the push from the administration," said Deniece Peterson, an analyst at Reston, Va.-based Input. Obstacles to the adoption of cloud computing, including concerns about security and loss of data control, may slow momentum, but "I think we'll see broader adoption and higher spending after the administration makes progress in some of the pilot programs it has planned," said Peterson. Singer said the CIA's IT department was moving in the direction of cloud computing, even if it wasn't using that term, when it widely deployed virtualization technology. Abstracting the operating system and software from the hardware "is the foundation of the cloud," Singer said. "We were headed to an enterprise cloud all along."
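The staffing difference behind those ratios is easy to quantify. A quick illustrative calculation, using the figures cited in the article and a hypothetical 1,500-server data center:

```python
def admins_needed(servers: int, servers_per_admin: int) -> int:
    """Round up: even a partial remainder of servers still needs an administrator."""
    return -(-servers // servers_per_admin)  # ceiling division

# Hypothetical 1,500-server data center, using the article's ratios:
traditional = admins_needed(1500, 75)   # one admin per 75 physical servers
cloud = admins_needed(1500, 500)        # one admin per 500+ servers in a cloud
print(traditional, cloud)  # 20 3
```

At those ratios, the same data center needs roughly a sixth of the administration staff, which is exactly the kind of head-count pressure Singer describes.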

Microsoft pulls covers off Project 2010

Calling it the "most significant" upgrade in a decade, Microsoft Corp. today revealed details of its project management application, Project 2010. The new version will include tighter integration with Microsoft Outlook e-mail and new collaboration features for buyers of the mid-range Project Professional version. Final release is due by the first half of next year, about the same time Microsoft Office 2010 is released, according to Chris Capossela, senior vice president of the Microsoft Business Division. "This is the most significant release in more than 10 years," he said. Project 2010 will also be the latest Microsoft application to adopt the Office "Fluent" user interface, better known as the controversial "Ribbon." Finally, Microsoft is cutting the number of versions of Project from four to three, injecting portfolio management capabilities from the short-lived Project Portfolio Server into Project Server 2010. Microsoft is making the public beta of Project 2010 available later this fall, though it is taking sign-ups today.

Microsoft made the announcement during its Microsoft Project conference, which is taking place in Phoenix this week. Capossela, who was general manager for the Project product earlier this decade, is giving a keynote speech at the conference. Though relatively unknown, Project is used by about 20 million workers, according to Microsoft. It is one of the company's 10 largest revenue generators, Capossela said. "It's a wonderful, quieter business," he said. One new feature is user-controlled scheduling, which is aimed at users who don't require rigid, automatically generated deadlines.

For 2010, Microsoft will offer Project Desktop, Project Professional and Project Server. Another new feature lets managers whose companies have Project Server assign tasks to employees and then lets them respond, provide updates and fill out timesheets, all through their Outlook e-mail client. Those workers must be covered by Client Access Licenses (CALs) for Project Server, but do not need to have purchased Project Desktop. "We're trying to widen the funnel down to team participants," Capossela said. Other examples are collaboration abilities, such as viewing task lists for groups and timeline views of multi-worker projects, that are part of the Project Professional application. Previously, such features required users to buy the higher-end Project Server. This data is hosted on any version of SharePoint, including the free version known as SharePoint Foundation. Microsoft is also changing the Project user interface to Office 2007's "Ribbon." Capossela said the move was necessary because of the many Project commands, which he said "numbered in the thousands."

HP upgrades Unix platform with data protection

HP this week unveiled updates to its HP-UX Unix OS and Serviceguard high-availability software, offering capabilities in data protection, data privacy, and business continuity. The software packages run on HP Integrity and HP 9000 servers. [ Check out InfoWorld's report on how HP has been looking to lure Sun Solaris Unix users to HP-UX. ] The Unix upgrade offers automated features to reduce maintenance requirements, improve availability, and enhance security, the company said. Update 5 of HP-UX 11i v3 and Serviceguard restore application services in the event of hardware or software failure, HP said.

Users can lower operational costs and increase efficiency in such demanding applications as online transaction processing or business intelligence, according to HP. "Comprehensive" data protection is provided through encryption for data in transit and at rest, HP said. Update 5 provides as much as 99 percent of raw disk performance, enabling reduction in operational costs for large databases and accelerated access to business-critical information. Enhanced data privacy is provided through Bastille, an automated system-hardening tool that configures a system to protect against unauthorized access. Administrator productivity is improved with expanded security bulletin analysis and patch maintenance. Business continuity is improved through minimization of downtime in the OS's Logical Volume Manager. Security issues are identified for as many as 100 systems in a single view when integrated with HP System Insight Manager.

Simplified standards compliance is offered through PCI (Payment Card Industry) and Sarbanes-Oxley Act report templates. HP Serviceguard, which is part of the HP Virtual Server Environment software suite, is integrated with HP-UX 11i to protect applications from downtime, HP said. Another improvement is elimination of business interruptions with Online Package Maintenance capabilities that run routine maintenance and upgrades while the system is active. Business connectivity is enabled during Serviceguard upgrades through a Dynamic Root Disk tool that reduces server network downtime by 75 percent, the company said. Management of server connections is improved with a graphical cluster topology map for administration and configuration. Also, traffic is coordinated between clustered servers and storage arrays.

Quick actions help financial firm avoid security disaster

While most of the IT world has been spared a devastating security attack like Blaster and Sasser for the last few years, the damage wrought by all manner of lesser-known computer viruses continues to inflict corporate pain. For example, New York City-based investment firm Maxim Group faced a security ordeal this year when a virus outbreak pummeled the company's Windows-based desktop computers and servers. "Early on April 15th, a few people called to say they were having problems with their computers," says John Michaels, the firm's CTO, describing how the IT staff started to get an inkling that morning that something was terribly wrong. "After looking into it, we knew something bad was happening, affecting all our users, and my servers." Malware was disabling applications by corrupting .exe files so they wouldn't open once they were closed, while also making thousands of connections to servers, saturating the network. "It damaged all the .exe files by corrupting them," says Michaels. "People were logging on and getting a blank screen." The virus was also altering the registry of the computers. Maxim Group didn't have a centralized antivirus product in place, having allowed various groups to go their own way with differing products. In response, Maxim Group told the approximately 325 computer users not to shut down their computers while Michaels and his team contacted vendors for assistance.

The decision to change that practice was made on the spot. Antimalware vendor Symantec was called in to set up a centralized antivirus server, while also attempting to analyze what the malware was and advise on clean-up. It wasn't easy. "Symantec took about three days to identify what the variant of the virus was," Michaels says. "They said they had never seen a variant of this." The virus was finally identified as a variant of "Sality," an older virus that strikes at .exe files and now also will install a backdoor and Trojan. "We asked Symantec, are we the only ones telling you about this? And they said, 'We have 3 million infected.'" Cleaning up more than 300 virus-riddled PCs was a huge headache. In the course of beating back Sality, Michaels says he also contacted another vendor, Cymtec Systems, whose product he had demoed, to install the security vendor's Sentry gateway, which monitors traffic and bandwidth usage, enforcing Web site policies and blocking malware.

Symantec advised total re-imaging of the computers, which Maxim Group undertook, a process that consumed several weeks. To this day, Michaels says he's not sure how the Sality variant got into Maxim Group's network to explode in that April 15 outbreak. "Maybe it was a Web site or a USB device, I don't know," Michaels says. But the virus outbreak also showed there was communication from the infected PCs to what might be a botnet. "They were connecting to rogue Internet sites," Michaels says, adding that Sentry would help monitor for that kind of activity in the future. The reason for the Sentry gateway is to prevent employees from going to "Web sites they probably shouldn't," especially as Web surfing raises the risks of malware infection, Michaels says. On that day, things changed in terms of the investment firm deciding to enforce stricter Internet usage policies. "Before this episode, we allowed social network sites, but we don't now," Michaels says. Social networking sites are gaining a reputation as places where malware gets distributed, and if there's no clear business reason for using them, they're put off limits.

And are the old Blaster and Sasser worms that struck with such devastation over half a decade ago gone? Unfortunately not, says the "Top Cyber Security Risks" report released this week by SANS Institute in collaboration with TippingPoint and Qualys. The report — which examined six months of data related to 6,000 organizations using intrusion-prevention gear and 100 million vulnerability-assessment scans on 9 million computers to get a picture of various attack types — notes that "Sasser and Blaster, the infamous worms of 2003 and 2004, continue to infect many networks."

MediaTek app store to serve Chinese mass market

MediaTek has started shipping a new generation of its widely used mobile phone chips with support for an application download store that will first target China's masses of mobile subscribers, the store developer said Tuesday. The download store, now available only with the new MediaTek chips, is planned to launch outside China later as well, said Luo Tianbo, vice president of business development at Vogins, the middleware vendor that developed the platform. Chips from Taiwan-based MediaTek already power most mobile phones in China.

Handset makers, mobile carriers and other companies have announced plans for similar download stores as a way to lure users and boost revenue. Apple's App Store may launch in China when the iPhone formally goes on sale in the country this year, and China's three mobile carriers are all developing download stores. The MediaTek download platform will compete for phone buyers' attention with Apple's App Store and the three mobile carriers' stores. While the App Store may face regulatory obstacles and China Mobile's store, launched last month, has yet to take off, phones that support the MediaTek store could pour quickly into the hands of Chinese users. China has a huge market for mobile phones and services, with over 700 million mobile subscribers, and MediaTek holds over a 50 percent share of China's handset chip market, according to BNP Paribas. Still, the MediaTek store will not "absolutely" compete against the download stores from China's carriers, said Luo.

MediaTek began including support for the application download platform in its chipset packages for mobile phone manufacturers last month, and handsets that support it will go on sale in China around November, Luo said. MediaTek is in talks with China Mobile, China Unicom and China Telecom about altering the Vogins platform to support their stores as well, he said. The Vogins store currently has about 100 free or paid applications made by third-party developers, mostly games but also including other content such as e-books, he said. Vogins, which is majority-held by MediaTek, aims to reach at least 400 to 500 applications by the end of next year. One hugely popular program it may soon offer is a client for the QQ chat service, owned by Chinese portal Tencent, said Luo. The application store can be accessed from a software platform MediaTek modified from the Nucleus kernel, said Luo. Nucleus is a real-time operating system designed by Mentor Graphics for use mainly on embedded devices.

The retail price for handsets that support the MediaTek store could reach as low as US$100, partly because the company is using its own OS, JP Morgan predicted in a recent research note. A further boost for the store in China could come from its stock of local applications, JP Morgan said. "We think MediaTek is in a strong position to build a far bigger set of China-specific applications than any other vendor," the note said. MediaTek did not immediately reply to a request for comment.

Oracle breaks silence on Sun plans in ad

Oracle Corp. ended its silence Thursday on its post-merger plans for Sun Microsystems Inc.'s Unix systems in an advertisement aimed at keeping Sun customers from leaving the Sparc and Solaris platforms. Ever since Oracle announced in April its plans to acquire Sun, its competitors - notably IBM and Hewlett-Packard Co. - have been relentlessly pursuing Sun's core customer base, its Sparc and Solaris users. Oracle's ad to "Sun customers" makes a number of promises, including spending more "than Sun does now" on developing Sparc and Solaris, as well as boosting service and support by having "more than twice as many hardware specialists than Sun does now." Analysts see Oracle's ad as a defensive move that doesn't answer some of the big questions ahead of the $7.4 billion merger with Sun. In fact, there may be a lot of room for skepticism and parsing of Oracle's claims, despite their apparent black-and-white assertions. Among the top hardware makers, Sun registered the biggest decline in server revenue in the second quarter, offering evidence that this protracted merger may be eroding Sun's value.

Oracle wanted the acquisition completed by now, but the European Commission this month said it would delay its antitrust review because of "serious concerns" about its impact on the database market. Europe is allowing until mid-January to sort this out, which keeps the merger in limbo for another quarter. Analysts also point out that Oracle's plan to spend more "than Sun does now" may ring a little hollow, because Sun's spending on developing Sparc and Solaris is probably at a low. "The ad sounds convincing - but perhaps being a word nitpicker, the 'Sun does now' might not mean much if Sun has drastically cut back due to plummeting sales," Rich Partridge, an analyst at Ideas International Ltd., said in an e-mail. "I think someone at Oracle suddenly realized that Sun was bleeding so badly that what would be left when Oracle finally got control would be worth a small fraction of what they paid and no one would buy the hardware unit," Rob Enderle, an independent analyst, said in an e-mail. But Enderle said the ad's claims do not preclude Oracle from selling its hardware division; the company "will have to support the unit for a short time after taking control; during that short time they can easily outspend Sun's nearly non-existent budgets."

Taken at face value, the ad seems to indicate that Oracle will keep Sun's hardware and microprocessor capability and not spin it off, as some analysts believe possible. Gordon Haff, an analyst at Illuminata Inc., asked whether, if Oracle's plan were to start shopping the Sparc processor around on day one of the merger, "would they have put this ad out? Probably not," he said. "Does it preclude Oracle from changing their mind? No. Companies change their mind all the time." Indeed, Oracle's major competitive concern was spelled out in the ad in a quote from Oracle CEO Larry Ellison: "IBM, we're looking forward to competing with you in the hardware business." An erosion of Sun's customer base also hurts Oracle, because a lot of Sun customers are also Oracle customers, and Oracle doesn't want its existing customers to go to IBM and move away from Oracle's platform, Haff said.

iTunes gains Automatically Add to iTunes feature

One of the most often requested features for iTunes has been the ability to set a folder for it to watch, automatically adding any items you drop in that folder to its library. In iTunes 9, Apple has quietly added this feature, although I wouldn't blame you for not having noticed its existence. In typical Apple fashion, it's not exactly what people were asking for, but rather Apple's interpretation of what they want.

When you install iTunes 9, it automatically creates an Automatically Add to iTunes folder in your ~/Music/iTunes/iTunes Music folder (or under ~/Music/iTunes/iTunes Media if you created a new library after installing iTunes 9). When you put an iTunes-compatible media file in this folder, it will, as the name suggests, be added to iTunes automatically. Whenever you drop a file into that folder, it's instantly added to iTunes if the application is running; if not, it gets added the next time iTunes is launched. iTunes even looks for files in subfolders you create and adds them to the library as well. And if you ever delete or rename the Automatically Add to iTunes folder, iTunes simply creates a new one for you the next time it is launched. In my limited testing, I've found that it pretty much works as advertised.
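If you want this kind of convenience for an arbitrary folder of your own, the behavior described above is easy to approximate with a small script. The sketch below is a hypothetical re-creation, not Apple's implementation: the folder names mirror iTunes's, but the list of supported extensions and the function name are my own assumptions for illustration.

```python
import shutil
from pathlib import Path

# Assumed list of iTunes-friendly extensions; iTunes's real format
# support differs and is not documented here.
SUPPORTED = {".mp3", ".m4a", ".m4v", ".mp4", ".mov", ".aiff", ".wav"}

def process_watch_folder(watch: Path, library: Path) -> dict:
    """Sort files dropped into the watch folder, mimicking iTunes 9's
    Automatically Add to iTunes behavior: supported files are moved into
    the library, unsupported ones into a "Not Added" subfolder."""
    not_added = watch / "Not Added"
    result = {"added": [], "rejected": []}
    # sorted() materializes the listing first, so files we move below
    # are not re-visited during the walk.
    for item in sorted(watch.rglob("*")):
        if not item.is_file() or not_added in item.parents:
            continue  # skip directories and previously rejected files
        if item.suffix.lower() in SUPPORTED:
            library.mkdir(parents=True, exist_ok=True)
            shutil.move(str(item), str(library / item.name))  # move, not copy
            result["added"].append(item.name)
        else:
            not_added.mkdir(exist_ok=True)
            shutil.move(str(item), str(not_added / item.name))
            result["rejected"].append(item.name)
    return result
```

Run periodically (or from a folder-watching hook), this reproduces the visible workflow: compatible files leave the watch folder for the library, and incompatible ones pile up under Not Added.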

However, it does have a lot of caveats. For one thing, iTunes's list of supported formats, especially in the video department, is comically short. You can be pretty well assured that if a video was downloaded from the Internet, it will not be supported by iTunes. In such a case, iTunes will move the file to a Not Added subfolder within the Automatically Add to iTunes folder. But that's to be expected, because iTunes has never exactly supported a host of media formats. Still, there are other problems.

When users asked for an option to direct iTunes to a folder, they really wanted an option to direct iTunes to any folder. What Apple has done, on the other hand, is create a pre-designated folder for the task without giving an option to change it to any other location. So if you have a huge collection of media in your Movies folder or on an external hard disk drive containing files that you'd like to automatically add to iTunes, you'll still have to move them to that particular folder. What's the point, then? You could just drag and drop them onto the iTunes icon in the Dock and be done with it.

Well, you say, we can just use the Automatically Add to iTunes folder as our primary movies folder, then - maybe even move it to a location of our choosing and leave behind an alias to take its place. Wouldn't that work? Not so much. Not only does iTunes not accept anything added to the folder once you move it, but the presence of the alias prevents iTunes from creating a new version of the folder either.

And when iTunes does add media files from the Automatically Add to iTunes folder, it moves them into its media folder and organizes them as it normally would, even if you have the option to do so disabled under iTunes's advanced preferences. It also deletes any subfolders you create within that folder (although that's a logical conclusion, given that they're useless if the media files you put in them never stay there). The only possible use I can see is for you to set it as the default download location for media files you purchase or download off the Internet, so that they can automatically be added to iTunes without your having to do so (and even there, Apple has recommended you don't use it for incomplete files). In short, I don't think the feature is very useful in the form Apple chose to implement it. I hope Apple rethinks this, gives users the freedom to use any folder they want, and makes iTunes stop moving the media files around if the user doesn't want it to. It's still a (very small) step in the right direction, though.

ACLU files lawsuit on border laptop searches

The American Civil Liberties Union (ACLU) has filed a lawsuit demanding that U.S. Customs and Border Protection (CBP) release details of its policy that allows the agency to search travelers' laptops at U.S. borders without suspicion of wrongdoing.

The ACLU's lawsuit, filed Wednesday in the U.S. District Court for the Southern District of New York, is an effort to get CBP to respond to a Freedom of Information Act (FOIA) request that the civil liberties group filed in June about the laptop-search policy. The agency has not supplied any information, although the FOIA law requires it to give a response within 30 days, said Catherine Crump, staff attorney with the ACLU First Amendment Working Group.

The FOIA request and the lawsuit seek details about the laptop search policy, including how many laptops have been searched since the CBP instituted its search policy last year, Crump said. "Traveling with a laptop shouldn't mean the government gets a free pass to rifle through your personal papers," she said.

The ACLU and other civil liberties groups have complained that the CBP policy violates the Fourth Amendment to the U.S. Constitution, protecting U.S. citizens against unreasonable search and seizure.

The ACLU also wants to know how many laptops and electronic devices CBP has seized, how long CBP has kept those devices, and statistics about the race and ethnicity of the people whose laptops have been seized, according to the ACLU's FOIA request.

One Muslim group complained in April that CBP has unfairly targeted Muslim, Arab and South Asian Americans for laptop searches.

"The goal is that the public should have enough information to evaluate the risks of crossing the border with a laptop," Crump said. "It would be helpful to the public if they could evaluate whether this policy makes Americans any safer."

The press office of the U.S. Department of Homeland Security, CBP's parent agency, didn't immediately respond to a request for comments on the ACLU lawsuit.

CBP has asserted that it can search all files, including financial documents and Web browsing history, on travelers' laptops and electronic devices "absent individualized suspicion." The agency does, however, need probable cause that a crime has been committed before it can seize a device.

The CBP policy also allows the agency to conduct searches of "documents, books, pamphlets and other printed material, as well as computers, disks, hard drives and other electronic or digital storage devices," without suspicion of a crime.

Several Democratic members of the U.S. Congress have pushed for a change in the policy. The requested documents would be "enormously useful" for lawmakers debating the CBP policy, Crump said.

NASA: Shuttle thruster fails day before re-entry

NASA today discovered that a forward thruster that helps control the space shuttle Endeavour's speed and attitude is not working properly. The finding comes just a day before the shuttle is due to land at Kennedy Space Center.

NASA discovered the issue today during a pre-landing test of Endeavour's Reaction Control System steering thrusters. The problematic forward thruster, which helps control Endeavour after the deorbit burn, failed during testing, NASA said.

Bill Jeffs, a NASA spokesman, said the failed thruster simply won't be used during Friday's scheduled re-entry and landing. "They'll be able to land without it," he said. "The flight control system is in great shape."

According to NASA, the shuttle crew today also tested Endeavour's aerodynamic control surfaces that will be used once the shuttle enters the Earth's atmosphere. That system operated as expected.

The shuttle is scheduled to land at 10:48 a.m. EDT on Friday after spending 16 days aloft. The seven-person crew spent nearly 11 days docked with the International Space Station, where they made five spacewalks and used three different robots to install the final piece of the Japanese laboratory, along with spare parts and six new batteries to hold power created by the station's solar arrays.

NASA has said this mission was one of its most technically challenging ever as it called for a whopping five spacewalks and the use of three different robots - two of which were required to work in conjunction to literally hand off massive parts to each other. The robotic arm onboard the space station even made several Slinky-like "walks" down the backbone of the station.

The next shuttle mission is scheduled to launch on Aug. 18, when the Discovery is expected to deliver 33,000 pounds of supplies and equipment to the space station.

Microsoft's Mundie describes computing shift

In the future, computers will do more work automatically for people, rather than reacting to human input, Microsoft's head of research and strategy said on Monday.

"I've lately taken to talking about computing more as going from a world where today they work at our command to where they work on our behalf," said Craig Mundie, chief research and strategy officer for Microsoft.

"As powerful as computers are... they're still a tool. If you haven't done an apprenticeship and you don't know how to master the tool, you don't get as much out of it as you might," Mundie said.

Mundie addressed a group of university professors and government officials at the company's annual Faculty Summit, held at Microsoft's headquarters in Redmond, Washington.

This subtle shift at Microsoft comes after 10 or 15 years of work on enhancing the user interface for computers. That work has included handwriting, gesture, voice and touch interaction, but largely in the context of the existing graphical user interface.

About a year ago that work shifted a bit, in anticipation of technology improvements that would allow researchers to begin to apply these different ways of interacting with computers in a new way, beyond simply replacing the keyboard and the mouse.

"The question is, can't we change the way in which people interact with machines such that they are much better able to anticipate what you want to do and provide a richer form of interaction," Mundie said.

He compares this shift to a historical one that Nathan Myhrvold, his former boss, once pointed out. Myhrvold noted that video cameras were first used to record plays. Not until a few years later did people realize that they could create something new and glue together pieces of film to make a movie. "That's kind of what we're going through with computing," Mundie said.

As an example of what he envisions, Mundie showed off the latest version of a digital personal assistant. The company demonstrated the first version about a year ago: an application that let Microsoft employees speak to an image of a person on a computer screen to schedule a shuttle bus on campus.

The latest version, which Mundie demonstrated in a prerecorded demo, shows a monitor placed outside the door of an office. Someone walks up to the office and the face on the screen wakes up, greeting the person and asking if he'd like to talk to Eric, who works in the office. She informs the visitor that Eric is in a meeting and offers to schedule some time for him to meet Eric. After the visitor swipes his badge, she compares his and Eric's schedules and finds a time for them to meet.

Microsoft has learned some things about the requirements of such an application, were it to be commercially offered. When idle, the application uses 40 percent of the compute power of the machine, because it is constantly aware of its context. "That makes it so clear to me that this will have to be built on a hybridized client plus cloud architecture," Mundie said.

Microsoft often talks about combining local computing with Internet-based computing. The concept, which works well for Microsoft because of its business model based on software sales, is slightly different from Google's vision, which relies more on remotely hosted services.

But running an application like the assistant remotely would produce an unusable service, Mundie said. The assistant must respond to people relatively quickly. "That's not likely to be computed in real time if you interpose the latency of a wide area network in the middle of it," he said.

While the digital assistant demo was based on real technology, Mundie showed another demo of a vision for the future that might be possible when applying his vision for computers that anticipate users' needs.

The demo showed an office of the future. In the center is a desk with a large screen like the Surface device set against two walls that show projected images. Mundie used gestures to move documents and files around the wall surfaces and used a virtual keyboard on the screen in his desk.

One wall acted like a digital white board, where Mundie could save the contents of the white board after a meeting. He held up a page of paper with information printed on it and with a tap on the wall, copied the document to the wall. He also dragged a document from his phone to the wall, using gestures.

Mundie also pulled up an image of an architectural model that stretched across both walls. As he walked from one end of the wall to the other, the image moved, as if he was changing his perspective of the image in three dimensions. "Because cameras are tracking my position as I move it computes my eyepoint to be what it would look like from that location," he said.

He called the demo "half smoke and mirrors and part real." Some of the touch and gesture interactions were live technology, but his interactions with a digital assistant and with a person on a video call were prerecorded videos. But all the features are possible, he said. "If we don't know how we're going to make it work, we won't include it" in such a demo, he said.

The rest of the Faculty Summit will include presentations by Microsoft executives and partner researchers about collaborative projects.

FTC Eyes Blogs for Conflicts of Interest

Scanning Weblogs for product information and reviews has become a cornerstone in contemporary consumerism. Often, readers appreciate the opinion of someone who is not an expert to guide them to the right product. What some consumers may not know is that some of these writers are being paid for their smiles in the form of cash, free products and lavish trips.

The U.S. Federal Trade Commission wants to crack down on this, the latest form of payola, as part of its longstanding charge to protect consumers from false and misleading advertising. In order to do so, the organization proposes searching blogs for misleading information and failure to publicize potential conflicts of interest. That means any blog that reviews products - whether it is padded by corporations or not - is subject to a thorough examination.

The updated FTC Guidelines Concerning the Use of Endorsements and Testimonials in Advertising [PDF] are expected to pass later this summer, possibly with some modifications. If the plans are approved, the FTC will actively go after bloggers who fail to disclose if they're being compensated for their words. The FTC could then order violators to stop and pay restitution to consumers, or even sic the Justice Department on them for civil penalties. While penalties for failure to disclose financial interest have existed in the past, the FTC is broadening the scope of the media it is actively reviewing to cover blogs and bloggers. It may have been watching before, but now it's prepared to pounce.

There currently is no regulation in place specifically regarding bloggers and payola. Professional journalists and product reviewers are beholden to their employers and professional ethics policies, and often have to return review products after the review is published. (Here are PC World's Editorial Guidelines.) But because bloggers have no such guidelines unless they impose them on themselves, the rules are out the window and corporations can stuff wallets all they want, in any way they want.

It's a good idea to crack down on this new wave of unprofessional behavior. Nonprofessional product review blogs should maintain the integrity of an Internet community where average citizens can freely share ideas without the threat of being swindled by a massive corporation. Accepting payment and benefits from these companies without specifically stating that this is being done violates the trust of a community and serves to destroy its very foundation.

But it's also unnerving that the FTC is broadening the scope of the media it is watching to zero in on bloggers. This means that any blogger who is or is not paid by a corporation to deliver positive reviews may be under scrutiny. It also means that if you review products on your personal blog, there's a good chance that blog will be read by government suits. While many product review blogs are public and desire Web traffic, some may not desire this kind of publicity and still be haunted by the feds.

While the FTC is within its charter to exercise the same oversight of the Internet that it does with other media, this case highlights how the government is taking a more hands-on approach to the Net. This is a good thing, but it also runs the risk of raising hairs on the necks of privacy watchers who aren't so keen on having their personal blogs examined so closely.