Google to offer ‘ultra high-speed’ broadband in US

Google is spreading its wings in yet another direction – this time as a network provider, offering super-fast broadband to thousands of US homes.

It plans to build a fibre-optic network offering speeds of up to 1Gbps (one gigabit per second) to as many as 500,000 homes.

It said it would compete on price with other broadband providers offering much slower speeds.

Google said the trial was about promoting killer apps that would take advantage of fast speeds.

“We’re planning to build and test ultra high-speed broadband networks in a small number of trial locations across the United States,” the search giant said in its blog.

“We plan to offer service at a competitive price to at least 50,000 and potentially up to 500,000 people. We’ll deliver internet speeds more than 100 times faster than what most Americans have access to today,” it continued.

Growing Google

Google already has a fibre network which connects its data centres, speeds up search and lowers the cost of streaming video on YouTube.

Now it plans to take this to the next stage and connect that network directly to consumers’ homes.

The network will be available for any service provider to use, and Google is asking interested parties, from local governments to members of the public, to sign up to the plan.

The offer is part of Google’s expansion into controlling all aspects of a web user’s experience.

In late 2009 Google offered a service called public DNS, which it said would speed up web browsing for users.

The domain name system is a series of databases that translate web addresses into computer readable numbers called IP addresses.

“The average Internet user ends up performing hundreds of DNS lookups each day, and some complex pages require multiple DNS lookups before they start loading,” Google wrote in a blog at the time. “This can slow down the browsing experience.”

DNS requests are usually handled by a person’s Internet Service Provider (ISP).
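The translation described above can be seen directly from Python's standard library. This is a minimal sketch of a single resolver call, not the mechanism Google's service uses:

```python
import socket

# Resolve a human-readable name to an IP address, the same translation
# a browser's resolver performs before it can contact a web server.
# "localhost" keeps the lookup local, so no network access is needed;
# loading a real page can trigger one lookup per domain it references.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1
```

A public DNS service like Google's aims to answer exactly these queries faster than a typical ISP resolver, often by caching aggressively.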

In November 2009, the search giant also announced that it was working on a project to develop a faster version of http – which Google calls SPDY – to speed up the transfer of content over the web.

At the time, the firm said its lab tests had shown that web pages loaded “up to 55% faster” using the protocol.

Broadband expansion

The US, in common with countries around the world, is grappling with the best way to roll out next generation broadband.

In his first month in office, President Obama promised to spend $7.2bn on new broadband infrastructure.

The pot of money is available for smaller broadband providers and municipalities.

For Drew Clark, editor of US broadband comparison website BroadbandCensus, the impact of Google’s entry into the broadband market will depend on how many homes the network serves.

“If it is 50,000 homes then that isn’t a lot. But if it is 500,000 then that is a statement to say it is in the market place competing with the likes of Verizon,” he said.

Verizon has made big investments in fibre networks, with plans to pass 18 million homes with its fibre-optic service by the end of the year.

Rival AT&T has 17 million households in its network but its fibre network does not run all the way to the home.

Google said it hopes its fibre network will act as a testbed for new applications.

Andrew Ferguson, editor of UK broadband website ThinkBroadband, speculated about what some of those killer apps could be.

“One idea would be to expand Google Streetview, so that you can play a movie of a route you wanted to take, so that when you are visiting a brand new area you will have a much better understanding of the area than is possible from simple static street view maps,” he said.

White House puts companies on notice in China

WASHINGTON–U.S. Internet companies might soon need to find a new strategy for dealing with China.

In announcing that it is now U.S. policy to advocate a free and open Internet around the world, Secretary of State Hillary Rodham Clinton on Thursday essentially dared U.S. companies to follow Google’s lead and put an end to their complicit censorship of Internet content. Google has said it will shut down its Chinese search engine if it can’t find a way to offer an uncensored version under Chinese law, and while no one else has jumped on that bandwagon, they may soon have little choice.

“…We are urging U.S. media companies to take a proactive role in challenging foreign governments’ demands for censorship and surveillance. The private sector has a shared responsibility to help safeguard free expression. And when their business dealings threaten to undermine this freedom, they need to consider what’s right, not simply what’s a quick profit,” Clinton said in remarks Thursday at the Newseum, before an audience including members of Congress, representatives from nonprofit groups, and perhaps more than one Internet company executive forced to ponder the meaning of that paragraph.

Clinton stopped short of actually proposing regulations or sanctions on Internet companies that comply with censorship laws. But her tone was clear: it’s now the policy of the U.S. government to renounce corporate “engagement,” or the belief that by merely being in countries like China, U.S. Internet companies are helping expand access to information.

Will it work? Google, Microsoft, and Yahoo have already formed the Global Network Initiative, a consortium of companies and organizations designed to provide guidelines for operating in countries with authoritarian governments without turning into tools of those governments. Clinton acknowledged the work of the GNI during her speech, but is calling on companies to do more.

Microsoft declined to directly address its plans for China in a statement, but thanked Clinton for recognizing the GNI. “We welcome Secretary Clinton’s remarks and applaud the heightened attention she has brought to these issues of privacy and freedom of expression. We agree with Secretary Clinton that both governments and the private sector have important roles to play,” the company said. Last week, Microsoft CEO Steve Ballmer said that the company remained committed to China despite Google’s announcement.

Google, which was recognized during Clinton’s speech for “making the issue of Internet and information freedom a greater consideration in (its) business decisions,” said it welcomed the challenge. “Free expression and security are important issues for governments everywhere, and at Google we are obviously great believers in the value to society of unfettered access to information. We’re excited about continuing our work with governments, human rights organizations, and bloggers, to promote free expression and increased access to information in the years ahead,” it said in a statement.

Yahoo did not respond to a request for comment.


Wi-Fi ‘allergies’ leave man homeless

The Santa Fe New Mexican reports a man claiming to suffer from electromagnetic sensitivity is suing his neighbor for refusing to disconnect her electronic devices.

Santa Fe, New Mexico resident Arthur Firstenberg claims that his neighbor Raphaela Monribot’s use of electronic devices such as cell phones, computers, compact fluorescent lights and dimmer rheostats is aggravating his “electromagnetic sensitivity” and causing him to get sick.

“Within a day of [Monribot] moving in, I began to feel sick when I was in my house,” Firstenberg writes in his affidavit. “The electric meter for my house is mounted on [Monribot’s] house. Electromagnetic fields emitted in [Monribot’s] house are transmitted by wire directly into my house.”

A request for a preliminary injunction claims Firstenberg’s condition has left him homeless. Firstenberg “cannot stay in a hotel, because hotels and motels all employ wi-fi connections, which trigger a severe illness. If [Firstenberg] cannot obtain preliminary relief, he will be forced to continue to sleep in his car, enduring winter cold and discomfort, until this case can be heard.”

The Santa Fe New Mexican notes “Firstenberg’s motion is accompanied by dozens of notes from doctors, some dating back more than a decade, about his sensitivities.”

However, scientific studies, such as a 2005 trial at the Psychiatric University Hospital in Germany, suggest electromagnetic sensitivity is strictly a psychosomatic disorder.

“The major study endpoint was the ability of the subjects to differentiate between real magnetic stimulation and a sham condition,” the researchers wrote. “There were no significant differences between groups in the thresholds, neither of detecting the real magnetic stimulus nor in motor response.

“We found no objective correlate of the self-perception of being ‘electrosensitive.’ Overall, our experiment does not support the hypothesis that subjectively electrosensitive patients suffer from a physiological hypersensitivity to EMFs or stimuli.”

Do you think Firstenberg, and others claiming electromagnetic sensitivity, may be suffering real physiological effects and should be allowed to live free of electronic devices? Or should treatment be strictly psychological?


FCC’s underwhelming-looking broadband plan also tardy

Time to reset the game clock on the National Broadband Plan.

The plan, due to be presented to Congress next month, now looks to be delayed by a month. Like a tardy student going to a professor, the FCC has written Congress to ask for a four-week extension on its “big paper.” Perhaps the agency can use the extra time to ensure the plan contains some of that “change” we’ve heard so much about.
Filling gaps

In December, we were given a sneak peek at the plan-in-progress. While interesting, it was also underwhelming.

Broadband plans in other countries have done things like aim for 1Gbps connections by 2015 (Japan), separate last-mile copper and fiber networks from backend networks and open them to all competitors (UK), and even build an open access national fiber network (Australia). The US, in contrast, looks ready to find and auction off some extra wireless spectrum in five years or so; it might also require rural telcos taking universal service money to provide low-speed broadband to all their lines. Oh, and some people might get Internet access on their TV sets.

The policies we’ve seen so far look good, and the FCC has had an impressive team working on the issue for nearly a year now, but the final result looks a lot like “tinkering around the edges” rather than doing something truly game-changing. The FCC commissioned a major report from Harvard researchers on world broadband markets, and that report made essentially one recommendation: mandate line-sharing rules to provide real competition. But it’s not in the plan.

The Department of Commerce also weighed in this week, telling the FCC that US broadband was generally a duopoly, that wireless really wouldn’t be a replacement for wireline networks, and that providing more spectrum wouldn’t fix the competitive situation.

Everyone’s calling for bold action on broadband, even Republicans like Sen. Kay Bailey Hutchison (R-TX). In an op-ed this week, Hutchison demanded a “daring, comprehensive” plan. What was her main idea for such a plan? Additional wireless spectrum.

The FCC continues to insist it will deliver something solid. “Gaps” in US broadband access will be addressed “boldly,” said FCC Chair Julius Genachowski this week. Extending coverage to all is a good thing. Opening up spectrum, especially to unlicensed use (which brought us WiFi and now White Space Devices) is a good thing. But nothing coming from the FCC looks likely to push US ISPs to be truly awesome on the world stage (see our piece on incredible ISPs around the world, and take a look at the service they are already providing before you say it can’t be done here).

ISPs in France already offer ADSL connections of up to 28Mbps that provide TV, Internet, and phone service for €29.99 a month, showing just how much can be done by the right kind of competition. Meanwhile, Americans can pay $35/month for 6Mbps Internet-only DSL connections.

The FCC’s point man for broadband, Blair Levin, has essentially ruled out line-sharing already, and he’s also right that just “thinking big” without having a plan to get there is ineffective. And yes, some of the high speeds advertised in other countries can’t be obtained in reality. But a look round the world shows that broadband can at least be done better, it can certainly be done cheaper, and success is often a function of the regulatory environment. That doesn’t mean government-run broadband; it just means that the ground rules truly encourage competition, the sort of competition that both the Department of Commerce and the Department of Justice don’t currently see in the market.

We’ll reserve final judgment on the FCC’s efforts until March, when the National Broadband Plan is revealed in its full splendor. But we’re skeptical that a few more weeks will lead the agency to think any bigger. When JFK announced that the US would race for the moon, he said we would pursue moon landings “not because they are easy, but because they are hard.” When it comes to broadband, it looks like we’ll be doing a host of good—but pretty easy—things.


Mars rover Spirit’s days may be numbered

One of NASA’s seemingly immortal Mars rovers might soon be at the end of its days.

The Spirit rover has been cruising around the Red Planet, along with its companion, Opportunity, since the pair arrived six years ago this month. (Spirit landed on January 3, 2004, while Opportunity landed on January 24 of that year.) Their mission to send back photos and data about the Martian surface was expected to last a mere 90 days. Instead, the two traveling research bots blew away all expectations, continuing their treks year after year.

However, scientists warn that Spirit’s most recent anniversary might have been its last. The rover became stuck in a sand trap nine months ago, after one of its wheels broke through a crusty layer of soil into a pocket of loose sand. It wasn’t the first time Spirit had run into trouble. Its right-front wheel stopped working in 2006, and a month ago, its right-rear wheel began to fail.

Scientists continue to try to get Spirit out of the sand pit, but so far those efforts have been unsuccessful. Wiggling the wheels and rotating them very slowly have resulted in only minimal improvements in the situation. Next, NASA could try having Spirit drive backward or use its robotic arm to sculpt the ground directly in front of one of its wheels. But expectations are low, and on Wednesday, NASA said it is running out of maneuvers to attempt.

All of this is worsened by the fact that the rovers are solar-powered, which means they need to collect sunlight with their onboard solar panels in order to power their operations and create enough heat to survive the frigid winters on Mars.

In the southern hemisphere of Mars, where Spirit is trapped, it is currently autumn–so precious sunlight is declining with each day. The rover also happens to have settled into a position that’s far from ideal for collecting what sunlight remains. It’s tilted five degrees to the south, but the sun is in the north.

Even if Spirit cannot escape its sandy prison, all isn’t necessarily lost–at least for now. Ray Arvidson, who’s from Washington University in St. Louis and who also serves as deputy principal investigator for the rovers, says that if scientists can improve Spirit’s tilt, it might be able to collect enough power to keep doing research right where it is.

“We can study the interior of Mars, monitor the weather, and continue examining the interesting deposits uncovered by Spirit’s wheels,” said Arvidson in a statement.

If the team cannot free Spirit or improve its angle, NASA estimates that the rover will run out of power in May–if not sooner.

Meanwhile, Spirit’s sister rover, Opportunity, keeps rolling on. It is currently making the seven-mile trek from Mars’ Victoria crater to the Endeavour crater to continue its research.


Tech Boom: Intel’s Earnings Up an Astounding 875%

We thought Zillow’s 2011 IPO was a good sign for the tech and Internet market. Intel has not only just confirmed that notion, but blown everybody’s expectations right out of the water.

The world’s largest chipmaker just wowed Wall Street and the tech world with its latest earnings report. The publicly-traded company reported a net income of $2.3 billion in the fourth quarter of 2009, up an amazing 875% from its $234 million earnings in the fourth quarter of 2008. This more than beat Wall Street expectations.

While we won’t go into detail on the financial numbers, we do want to highlight some of the key stats:

– Revenues in Q4 2009 rose to $10.6 billion, a climb of 28% from $8.3 billion a year earlier.

– However, the big picture shows Intel had a better 2008 than 2009: 2009 revenues were $35.1 billion, while 2008 revenues reached $37.6 billion, a decline of roughly 7%.

– Intel predicts revenues of approximately $9.7 billion in Q1 2010, above Wall Street estimates.

– Around a year ago, in the depths of the economic collapse, Intel decided to invest $7 billion in new chip plants. That bet looks to be paying off.

Intel’s Q4 report is one of the first to come out this year, but it won’t be the last. If Intel’s numbers are any indication though, we’re nearing the light at the end of the tunnel.


USB 3.0 Finally Arrives

When you’re in front of your PC, waiting for something to transfer to removable media, that’s when seconds feel like minutes and minutes feel like hours. Data storage scenarios like that one are where the new SuperSpeed USB 3.0’s greatest impact will be felt first. At CES, 17 SuperSpeed USB 3.0-certified products were introduced, including host controllers, adapter cards, motherboards, and hard drives (but no other consumer electronics devices). Still more uncertified USB 3.0 products are on the way, and they can’t get here fast enough.

Glance Backward

The beauty of USB 3.0 is its backward compatibility with USB 2.0: you need a new cable and a new host adapter (or one of the Asus or Gigabyte motherboards that supports USB 3.0) to get USB 3.0 speeds, but you can still use the device on a USB 2.0 port at typical USB 2.0 performance. By reducing some of USB’s overhead (the interface now transmits data only to the link and device that need it, so idle devices can drop into a low-power state), the new incarnation uses one-third the power of USB 2.0.

The throughput improvement USB 3.0 offers is dramatic: a theoretical 10X jump over existing USB 2.0 hardware. USB 2.0 maxed out at a theoretical 480Mbps, while USB 3.0 can handle up to 5Gbps. Mind you, applications like storage will still be limited by the type of drive inside; for example, you can expect better performance from RAIDed hard drives or fast solid-state drives (SSDs) than from a standalone single drive connected to the computer via USB 3.0.
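Those ceilings translate readily into transfer times. A rough sketch, using an illustrative 25GB (Blu-ray-sized) file; real throughput would be lower and drive-limited:

```python
# Back-of-the-envelope transfer times at the theoretical maximums
# quoted above. The 25GB file size is an assumed, illustrative figure.
FILE_GB = 25
file_gigabits = FILE_GB * 8          # 200 gigabits to move

usb2_gbps = 0.480                    # USB 2.0 ceiling: 480Mbps
usb3_gbps = 5.0                      # USB 3.0 ceiling: 5Gbps

usb2_s = file_gigabits / usb2_gbps   # ~417 seconds (about 7 minutes)
usb3_s = file_gigabits / usb3_gbps   # 40 seconds

print(f"USB 2.0: {usb2_s:.0f}s, USB 3.0: {usb3_s:.0f}s")
# -> USB 2.0: 417s, USB 3.0: 40s
```

At the spec ceilings, the ratio works out to the roughly tenfold jump described above; in practice the drive's own read and write speed dominates.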

The real-world examples are fairly convincing, and they underscore USB 3.0’s advantage for high-def video, music, and digital imaging applications. Our early test results are encouraging as well: We tested Western Digital’s My Book 3.0, the first USB 3.0-certified external hard drive. Its performance was on a par with that of eSATA, but the benefit here is that USB 3.0 is a powered port, so you don’t need a separate power supply running to the drive (as you do with eSATA, unless the eSATA drive is designed to draw power from a USB port while transferring data over the eSATA interface).

New Entries

While the WD drive was the first to be announced, a slew of other hard drive makers either announced products at the show or discussed plans to release products in the coming months. Among them: Seagate (which is doing a portable drive), LaCie, Rocstor, and Iomega. Even non-traditional hard drive vendors like Dane-Elec and A-Data showed products they billed as USB 3.0 (those two even had USB 3.0-connected SSDs, the first external drives to use solid-state storage inside).

One of the things to look for in the coming months is the certified SuperSpeed USB 3.0 logo. Products are currently filling the queues at the official certification testing labs, but the presence of that certification logo will give you some peace of mind that the product you’re buying truly does live up to the USB 3.0 spec.

Given that the certification labs are jammed up, though, you can expect companies to release USB 3.0 products without official certification. (Buffalo Technologies’ drive, released late 2009, is not certified; LaCie’s drives are in the process of certification, but will initially carry LaCie’s own logo for USB 3.0, and will gain a sticker on the box once certification is completed.) And in those cases, it will be hard to know whether the device truly lives up to its performance potential.

Compatibility Guarantee

And this time around, the way the USB spec is written, says Jeff Ravencraft, consumers should have an easier time finding products that are truly USB 3.0. Before, in the transition from USB 1.1 to USB 2.0, the USB 2.0 spec was written in a way where it “encompassed low, full and high-speed USB,” explains Ravencraft, president and chairman of the USB Implementers Forum. “Since those are all encapsulated in the USB 2.0 spec, [vendors] could have a certified product that’s low-speed, but still call it USB 2.0.

“We don’t have that issue with USB 3.0. To claim you’re USB 3.0, you have to deliver 5Gbps. There’s no other way to get the certification.”

Ravencraft adds that the group is prepared to protect the USB 3.0 logo, to make sure that only manufacturers who go through certification use it. “We’ll take legal action if anyone infringes on our marks.”

By the end of the year, Ravencraft says, the logjam of products awaiting certification should be past, and the organization’s network of worldwide test labs will be handling USB 3.0 certification.

According to In-Stat Research, more than one-quarter of USB products will support SuperSpeed USB 3.0 by 2013.

Ravencraft says this is the fastest ramp-up of USB products he has seen in the past ten years, across all previous versions of USB.

I say the change can’t come fast enough. The trick, though, will be getting the interface into our notebooks (without requiring a kludgy ExpressCard adapter). So far, only HP and Fujitsu have announced limited USB 3.0 support on notebooks. And Taiwanese notebook and desktop maker MSI indicated that it wouldn’t have USB 3.0 until the third quarter of this year at the earliest; product managers for both notebooks and desktops cited manufacturing concerns such as large-quantity chipset availability and the need to test USB 3.0 chipsets.

In the meantime, the only announced peripherals remain storage devices. At next year’s CES, it’s likely we’ll hear more about specific consumer electronics devices, such as digital cameras and camcorders, moving to USB 3.0. Hopefully by then we’ll start seeing a critical mass of PC hardware with USB 3.0 integrated, too.


Google Plans Ultrafast Internet Broadband

The search giant says it’ll build experimental 1-gigabit-per-second broadband networks in a small number of test locations.

Google on Wednesday said that it plans to build a series of experimental high-speed networks that will provide broadband connectivity at speeds 100 times beyond typical U.S. broadband connections.

Under The American Recovery and Reinvestment Act of 2009, signed into law in February 2009, the Federal Communications Commission (FCC) was directed to create a National Broadband Plan to promote better online communication and scientific, economic, and cultural development.

Google has been advising the FCC on the plan’s development. With 35 days until the FCC unveils its plan, Google has decided to build high-speed broadband networks in a small number of test locations. The company is promising Internet speeds of up to 1 gigabit per second, through fiber-to-the-home connections.

Google said it will offer network access to between 50,000 and 500,000 people at competitive prices.

“We’re doing this because we want to experiment with new ways to make the Web better and faster for everyone, allowing applications that would be impossible today,” said Google product manager James Kelly in a video.

Examples of such applications include 3D medical imaging over the Web, downloading high-definition feature films in less than five minutes, and collaborating with geographically dispersed classmates while watching a live, 3D lecture.
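The film example is consistent with simple arithmetic. A quick check, assuming a 30GB HD feature (an illustrative size, not one Google quoted):

```python
# Sanity-check the "HD film in under five minutes" claim at 1Gbps.
FILM_GB = 30                         # assumed size of an HD feature film
LINK_GBPS = 1.0                      # the promised 1 gigabit per second

seconds = FILM_GB * 8 / LINK_GBPS    # 240 seconds
print(f"{seconds / 60:.0f} minutes")  # -> 4 minutes
```

Even with protocol overhead shaving something off the line rate, a gigabit link leaves comfortable headroom under the five-minute mark.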

Google expects that the availability of high-speed Internet access will allow developers to create new applications that haven’t yet been imagined.

The company says that its experimental networks will be operated under “open access” principles, so that users have a choice of service provider, and that its networks will be managed in an open, transparent, and non-discriminatory way.

Google is soliciting involvement from community partners through a Request For Information (RFI). Government officials and members of the public can nominate their communities to be test participants at Google’s Web site before March 26.

Source: InformationWeek

If Your Password Is 123456, Just Make It HackMe

Back at the dawn of the Web, the most popular account password was “12345.”

Today, it’s one digit longer but hardly safer: “123456.”

Despite all the reports of Internet security breaches over the years, including the recent attacks on Google’s e-mail service, many people have reacted to the break-ins with a shrug.

According to a new analysis, one out of five Web users still decides to leave the digital equivalent of a key under the doormat: they choose a simple, easily guessed password like “abc123,” “iloveyou” or even “password” to protect their data.

“I guess it’s just a genetic flaw in humans,” said Amichai Shulman, the chief technology officer at Imperva, which makes software for blocking hackers. “We’ve been following the same patterns since the 1990s.”

Mr. Shulman and his company examined a list of 32 million passwords that an unknown hacker stole last month from RockYou, a company that makes software for users of social networking sites like Facebook and MySpace. The list was briefly posted on the Web, and hackers and security researchers downloaded it. (RockYou, which had already been widely criticized for lax privacy practices, has advised its customers to change their passwords, as the hacker gained information about their e-mail accounts as well.)

The trove provided an unusually detailed window into computer users’ password habits. Typically, only government agencies like the F.B.I. or the National Security Agency have had access to such a large password list.

“This was the mother lode,” said Matt Weir, a doctoral candidate in the e-crimes and investigation technology lab at Florida State University, where researchers are also examining the data.

Imperva found that nearly 1 percent of the 32 million people it studied had used “123456” as a password. The second-most-popular password was “12345.” Others in the top 20 included “qwerty,” “abc123” and “princess.”

More disturbing, said Mr. Shulman, was that about 20 percent of people on the RockYou list picked from the same, relatively small pool of 5,000 passwords.

That suggests that hackers could easily break into many accounts just by trying the most common passwords. Because of the prevalence of fast computers and speedy networks, hackers can fire off thousands of password guesses per minute.
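The attack Shulman describes can be sketched in a few lines. The password list and "stolen" hash database below are toy stand-ins for illustration only:

```python
import hashlib

# Tiny stand-in for the ~5,000 most common passwords in the analysis.
COMMON = ["123456", "12345", "qwerty", "abc123", "princess", "iloveyou"]

# Toy stolen database of unsalted SHA-256 hashes (illustrative only;
# a real site should salt passwords and use a deliberately slow hash).
stolen = {
    "alice": hashlib.sha256(b"iloveyou").hexdigest(),
    "bob": hashlib.sha256(b"h7$Qz!padL0ck").hexdigest(),
}

# Trying just the short common list is enough to crack weak accounts.
cracked = {}
for user, digest in stolen.items():
    for guess in COMMON:
        if hashlib.sha256(guess.encode()).hexdigest() == digest:
            cracked[user] = guess
            break

print(cracked)  # -> {'alice': 'iloveyou'}
```

The point of the sketch: the attacker never brute-forces the full keyspace; a handful of popular guesses per account already yields a meaningful hit rate.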

“We tend to think of password guessing as a very time-consuming attack in which I take each account and try a large number of name-and-password combinations,” Mr. Shulman said. “The reality is that you can be very effective by choosing a small number of common passwords.”

Some Web sites try to thwart the attackers by freezing an account for a certain period of time if too many incorrect passwords are typed. But experts say that the hackers simply learn to trick the system, by making guesses at an acceptable rate, for instance.
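At its simplest, the freeze-out defense is a per-account failure counter. The five-attempt threshold below is an arbitrary choice for this sketch:

```python
MAX_ATTEMPTS = 5          # arbitrary lockout threshold for this sketch
failures = {}             # account -> consecutive failed logins

def try_login(account, password, real_password):
    """Refuse further guesses once an account passes the threshold."""
    if failures.get(account, 0) >= MAX_ATTEMPTS:
        return "locked"
    if password == real_password:
        failures[account] = 0    # success resets the counter
        return "ok"
    failures[account] = failures.get(account, 0) + 1
    return "wrong password"

# An attacker cycling through common guesses hits the lock quickly.
for guess in ["123456", "12345", "qwerty", "abc123", "princess", "letmein"]:
    print(try_login("alice", guess, "h7$Qz!padL0ck"))
# -> five lines of "wrong password", then "locked"
```

This is also why the evasion the article mentions works: an attacker who spreads guesses slowly, or across many accounts, stays under any fixed per-account threshold.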

To improve security, some Web sites are forcing users to mix letters, numbers and even symbols in their passwords. Others, like Twitter, prevent people from picking common passwords.

Still, researchers say, social networking and entertainment Web sites often try to make life simpler for their users and are reluctant to put too many controls in place.

Even commercial sites like eBay must weigh the consequences of freezing accounts, since a hacker could, say, try to win an auction by freezing the accounts of other bidders.

Overusing simple passwords is not a new phenomenon. A similar survey examined computer passwords used in the mid-1990s and found that the most popular ones at that time were “12345,” “abc123” and “password.”

Why do so many people continue to choose easy-to-guess passwords, despite so many warnings about the risks?

Security experts suggest that we are simply overwhelmed by the sheer number of things we have to remember in this digital age.

“Nowadays, we have to keep probably 10 times as many passwords in our head as we did 10 years ago,” said Jeff Moss, who founded a popular hacking conference and is now on the Homeland Security Advisory Council. “Voice mail passwords, A.T.M. PINs and Internet passwords — it’s so hard to keep track of.”

In the idealized world championed by security specialists, people would have different passwords for every Web site they visit and store them in their head or, if absolutely necessary, on a piece of paper.

But bowing to the reality of our overcrowded brains, the experts suggest that everyone choose at least two different passwords — a complex one for Web sites where security is vital, such as banks and e-mail, and a simpler one for places where the stakes are lower, such as social networking and entertainment sites.

Mr. Moss relies on passwords at least 12 characters long, figuring that those make him a more difficult target than the millions of people who choose five- and six-character passwords.
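The arithmetic behind that choice: each added character multiplies the search space, so doubling the length squares it. A sketch, assuming the full 95-character printable ASCII alphabet:

```python
ALPHABET = 95                    # printable ASCII characters (assumed)

six_char = ALPHABET ** 6         # ~7.4e11 possible passwords
twelve_char = ALPHABET ** 12     # ~5.4e23 possible passwords

# Doubling the length squares the search space, so a 12-character
# password costs 95**6 (~740 billion) times as many guesses to
# exhaust as a 6-character one.
print(f"{twelve_char // six_char:,} times more guesses needed")
```

Against a brute-force attacker the gap is astronomical; against the common-password lists described above, length alone is no help if the password itself is on the list.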

“It’s like the joke where the hikers run into a bear in the forest, and the hiker that survives is the one who outruns his buddy,” Mr. Moss said. “You just want to run that bit faster.”


Antivirus makers applaud, mock Microsoft Security Essentials

Four antivirus makers have weighed in on the release of Microsoft Security Essentials, and their opinions are all over the place. We asked various security companies for their opinion on MSE, which launched yesterday, and Symantec, ESET, Avast, and AVG responded with their thoughts.

Microsoft claims it is targeting consumers who currently don’t have any protection on their Windows PC, but of course MSE will end up on many computers that already have third-party security software installed. Since MSE is free, the software security market is going to get a serious shake-up, and here’s what Microsoft’s new competitors think about what’s about to happen.

Symantec, maker of the Norton line of products, says MSE doesn’t stand a chance in today’s market: “While we applaud any vendor that heightens consumer awareness of the need for computer security, it’s clear that the threat landscape has moved on from the product Microsoft is launching,” a Symantec spokesperson told Ars. “Microsoft Security Essentials (MSE) is a stripped down version of their old OneCare product which was poorly rated by industry experts and users alike. From a security perspective, this Microsoft tool offers reduced defenses at a critical point in the battle against cybercrime. Unique malware and social engineering tricks fly under the radar of traditional signature-based technology alone—which is what is employed by free security tools such as Microsoft’s.”

ESET, maker of the NOD32 line of products, is unfazed by the product’s launch: “Certainly basic, but free, protection is better than no protection,” Christopher Dale, Public Relations Manager of ESET, told Ars. “For those whose primary concern is price, we would imagine MSE will hold great appeal while making the freeware market more competitive. The product doesn’t directly impact ESET as we offer a full-featured security solution w/ more configuration choices and free phone support.”

Avast is perfectly fine with Microsoft entering the market: “We are glad to see Microsoft joining us in offering free anti-virus/security protection to users,” Vince Steckler, CEO of Avast, told Ars. “We have long believed that top notch security protection should be freely available—that is why nearly 100 million users around the world protect their computers and data with our free avast! antivirus. Around the world there are about 500 million home computer users that need [to be] protected while using the Internet. We believe only around 20 percent of these users are using a traditional paid security product while 250 million are using avast! or one of the other high-quality free products. Users have already decided that security should be free—there are more users of free avast! than users of all paid products combined. But, free users should not be subjected to inferior or ‘basic’ protection.”

AVG, on the other hand, thinks Microsoft will push its product in as many anticompetitive ways as possible: “Microsoft will likely push MSE out via every automated channel available to them—which in and of itself poses all sorts of interesting anti-trust questions,” Siobhan MacDermott, VP Head of Public Policy, Corporate Communications, and Investor Relations for AVG Technologies, told Ars. “They will focus on gaining consumers through the simplicity of installing the product via routine channels of connection. On paper it makes sense, but in reality, we believe this will force consumers to unwittingly enter into a situation that makes them more vulnerable. Experts agree that the biggest nemesis to Windows was not the vulnerability of its code but rather the popularity of the operating system. It is a law of numbers; large communities create large pools of opportunities for thieves. If Microsoft leverages the power of its OS market to rapidly create a large community of MSE users, we believe those customers will be doubly vulnerable.”

There you have it; two antivirus makers are fine with Microsoft Security Essentials and the other two aren’t. We’re more surprised with the ones that are fine with it, since MSE can potentially steal customers away from them (in fact, many of our readers and users on other forums have already declared they are switching). In our first look at MSE yesterday, we were impressed with what Microsoft was offering as a free download for Windows XP, Windows Vista, and Windows 7. For those who have had a chance to install it, how do your thoughts compare to the above statements?