Wave of Ransom Malware Hits Internet

Criminals reused an attack from 2008 to hit the Internet with a huge wave of ransomware in recent weeks, a security company has reported.

In the space of only two days, February 8 and 9, the HTML/Goldun.AXT campaign detected by Fortinet accounted for more than half the total malware detected for February, which gives some indication of its unusual scale.

The attack itself takes the form of a spam e-mail with an attachment, report.zip, which, if opened, automatically downloads a rogue antivirus product called Security Tool. It is also being distributed through manipulated search engine optimisation (SEO) results on Google and other providers.

Such scams have been common on the Internet for more than a year, but this particular one features a more recently evolved sting in the tail. The product doesn’t just ask the infected user to buy a useless license, in the manner of scareware; it locks applications and data on the PC, offering access only once a payment has been made through the single functioning application left, Internet Explorer.

What’s new, then, is that old-style scareware has turned into a default ransom-oriented approach. The former assumes that users won’t know they are being scammed, while the latter assumes they will but won’t know what to do about it.

The technique is slowly becoming more common (see the Vundo attack of a year ago), but the scale of this attack also sets it apart: it is one of the largest single malware campaigns Fortinet has ever seen.

Fortinet notes that Security Tool is really a reheat of an old campaign from November 2008, which pushed the notorious rogue antivirus product Total Security as a way of infecting users with a keylogging Trojan.

“This is a great example of how tried and true attack techniques/social engineering can be recycled into future attacks,” says Fortinet’s analysis.

According to Fortinet, the “engine” pushing the spike in ransom-based malware is believed to be the highly-resilient Cutwail/Pushdo botnet, the same spam and DDoS system behind a number of campaigns in the last three years including the recent pestering of PayPal and Twitter sites.

Source

Energizer Duo battery charger hides a Trojan

The Energizer Duo USB battery charger has been hiding a backdoor Trojan in its software that affects computers using Windows. According to Symantec, the Trojan has probably been there since 10th May 2007.

Energizer has now taken the software for the model CHUSB charger off the market and removed the site from which it could be downloaded, and the company is asking customers who downloaded the Windows version to uninstall it. There are simple steps for removing the Trojan from affected machines, and Macintosh users are not affected.

Symantec’s Director of Global Intelligence, Dean Turner, said it’s impossible to be certain the Trojan has always been in the software that monitors the Duo USB charger, but the Trojan’s binary header states it was created in May 2007. It is not known how the Trojan came to be in the software, but malware has previously been found to be hidden inside products. Energizer is working with the US Computer Emergency Readiness Team (US-CERT) and the US government to try to find out how the code found its way into the software.

The Trojan allows an attacker to operate with the same privileges as the user who is logged in, and to remotely control the system via connections on 7777/tcp to send and receive files, run programs, and list the contents of directories.

US-CERT advises that to fix the problem, users can delete the Arucer.dll file from the Windows system32 directory, and then restart the system. An alternative fix is to remove the USB charger software. The Trojan Arucer.dll file will still be present but the code cannot be executed in the absence of the charger software. It is also advisable to block access to port 7777 using a firewall or via network perimeter devices.
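
US-CERT’s advice is simple enough to script. Here is a minimal sketch, in Python, of that clean-up on a default Windows install; the port number and file name come from the advisory, while the script itself is purely illustrative, and the deletion will fail if the charger software still has the DLL loaded.

```python
# Minimal clean-up sketch based on the US-CERT advice; illustrative only.
# Run as administrator, after uninstalling the charger software.
import os
import socket

# 1. Check whether anything is still listening on the backdoor's port, 7777/tcp.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.settimeout(1)
listening = probe.connect_ex(("127.0.0.1", 7777)) == 0
probe.close()
print("Port 7777 is", "OPEN (backdoor may be active)" if listening else "closed")

# 2. Delete the Trojan DLL from the system32 directory, then restart.
dll = os.path.join(os.environ.get("SystemRoot", r"C:\Windows"), "system32", "Arucer.dll")
if os.path.exists(dll):
    os.remove(dll)  # raises PermissionError if the DLL is loaded by a running process
    print("Removed", dll, "- restart the system to finish")
else:
    print("Arucer.dll not found")
```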

Energizer’s Duo USB battery chargers have been available in the US, Europe, Asia, and Latin America since 2007. They allow computer users to recharge Nickel Metal Hydride (NiMH) batteries either from a wall outlet or a USB connection, and the bundled software lets the user monitor the status of charging on the PC.

Source

Google Index to Go Real Time

Google is developing a system that will enable web publishers of any size to automatically submit new content to Google for indexing within seconds of that content being published. Search industry analyst Danny Sullivan told us today that this could be “the next chapter” for Google.

Last fall we were told by Google’s Brett Slatkin, lead developer on the PubSubHubbub (PuSH) real-time syndication protocol, that he hoped Google would someday use PuSH for indexing the web instead of the link crawling that search engines have relied on for years. Google senior product manager Dylan Casey said yesterday at Sullivan’s Search Marketing Expo in Santa Clara, California, that the company plans to soon publish a standard way for site owners to participate in a program much like that.

How The System Might Work

PuSH is a syndication system based on the Atom format, in which a Publisher tells the world about a Hub that it will notify every time new content is published. Subscribers then tell the Hub, “when this Publisher posts new content, please deliver it to me right away.” So instead of the Subscriber checking back with the Publisher all the time to see if there’s new content, it simply waits for the Hub to tell it there is. The Publisher publishes something and notifies the Hub, and the Hub delivers it to all the Subscribers. This can take as little as a few seconds.
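
To make that flow concrete, here is a rough sketch, in Python, of the Subscriber’s side of the handshake as the PubSubHubbub draft spec describes it: a single form-encoded POST to the Hub, which the Hub then verifies by calling the Subscriber back with a challenge. All of the URLs below are hypothetical placeholders.

```python
# Rough sketch of a PuSH subscription request (PubSubHubbub draft spec).
# The hub, feed, and callback URLs are hypothetical placeholders.
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "hub.mode": "subscribe",
    "hub.topic": "http://publisher.example.com/feed.atom",  # the feed we want delivered
    "hub.callback": "http://subscriber.example.com/push",   # where the Hub should deliver it
    "hub.verify": "async",  # the Hub confirms by GETting the callback with a challenge
}).encode()

resp = urllib.request.urlopen("http://hub.example.com/", data=params)
print(resp.status)  # 202 Accepted: the Hub will verify the subscription asynchronously
```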

If Google can implement an indexing-by-PuSH program, it would ask every website to implement the technology and declare which Hub it pushes to at the top of each document, just as sites declare where the RSS feeds they publish can be found. Google would then subscribe to those PuSH feeds to discover new content as it’s published.
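
Google’s side of that discovery step might look something like this sketch: fetch a feed once, pull out any links marked rel="hub", and subscribe there. It assumes a standard Atom document, and the feed URL is again a hypothetical placeholder.

```python
# Sketch of hub discovery: find <link rel="hub" href="..."/> in an Atom feed.
# The feed URL is a hypothetical placeholder.
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
with urllib.request.urlopen("http://publisher.example.com/feed.atom") as f:
    root = ET.parse(f).getroot()

hubs = [link.get("href")
        for link in root.iter(ATOM + "link")
        if link.get("rel") == "hub"]
print("Hubs to subscribe at:", hubs)
```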

PuSH wouldn’t likely replace crawling; in fact, a crawl would still be needed to discover PuSH feeds to subscribe to. But the real-time format would be used to augment Google’s existing index.

As Danny Sullivan told us today, Google would have to implement some sort of spam control and not just let content be pushed live to the index unvetted. That was what happened in the earliest days of search and it was a real mess, he told us.

The Advantages of a Real Time Google Index

PuSH is much more computationally efficient for Google, but Slatkin says that even more important is what such a move would mean for small publishers. Right now many small sites get visited by Google maybe once a week; with a PuSH system in place, they would be able to get their content to Google automatically, right away.

A richer, faster, more efficient internet would be good for everyone, but the benefits in search wouldn’t be limited to Google, either. PubSubHubbub is an open protocol, and the feeds would be as visible to Yahoo and Bing as they would be to Google.

“I am being told by my engineering bosses to openly promote this open approach even to our competitors,” Slatkin says. That’s a very good sign.

We expect this will be a very big deal and we’ll be covering it more extensively in the coming days, as well as whenever Google has something to announce more formally.

Source

Twitter Turns Fire Hose on Little Guys

Google, Microsoft and Yahoo (YHOO) are no longer the only ones drinking from Twitter’s fire hose of real-time data. On Monday, the company granted seven real-time search and discovery ventures access to the data as well: Ellerdale, Collecta, Kosmix, Scoopler, twazzup, CrowdEye, and Chainn Search. Each will be able to tap into the totality of Twitter’s data stream, sifting and indexing it and using it to further build out their services.

The price of that access? Unknown, but I can’t imagine it’s much. Twitter says it’s charging companies according to an as-yet-undisclosed scalable licensing scheme. For the likes of Google (GOOG) and Microsoft (MSFT), that means millions of dollars, enough to make Twitter profitable. For these seven upstarts, the fee is substantially less, at least until they evolve into higher-volume users.

In any event, it seems Twitter is finally pushing ahead with a business plan that could begin to justify the venture capital investment it has attracted.

Source

Virgin Media to offer 100Mbps broadband

Virgin Media plans to roll out a broadband service that tops out at 100 megabits per second to residential customers by the end of the year.

The company said Thursday that it will use its fiber-based network to deliver the new high speeds. Virgin Media, which provides broadband, TV, phone and/or mobile phone service to roughly 4.1 million homes across the U.K., currently offers 10Mbps, 20Mbps, and 50Mbps tiers of broadband service. And soon it will be delivering 100Mbps, the company said.

The company says the 100Mbps service will allow users to download a music album in as little as five seconds, an hour-long TV show in 31 seconds, and a high-definition movie in roughly seven minutes. The company claims these times are significantly shorter than what even the fastest tiers of rival services can manage.
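
Those quoted times square with simple arithmetic: at line rate, 100Mbps moves 12.5 megabytes of data per second, so the figures imply roughly a 60MB album, a 390MB episode, and a 5GB-plus HD movie. A quick back-of-the-envelope check, with those sizes as our assumptions rather than Virgin’s figures:

```python
# Back-of-the-envelope check of Virgin's quoted download times.
# The file sizes are our assumptions, not figures from Virgin.
rate_mb_per_s = 100 / 8  # 100Mbps is 12.5 MB/s at line rate
for item, size_mb in [("music album", 60),
                      ("hour-long TV show", 390),
                      ("HD movie", 5250)]:
    print(f"{item}: about {size_mb / rate_mb_per_s:.0f} seconds")
```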

“There is nothing we can’t do with our fibre optic cable network, and the upcoming launch of our flagship 100Mb service will give our customers the ultimate broadband experience,” Virgin Media’s CEO Neil Berkett said in a statement.

In some places, the company says it will be able to push speeds even higher than 100Mbps. Also on Thursday, the company announced it will extend testing of a 200Mbps service to other parts of the U.K. It announced the initial test in May 2009.

In the U.S., the Federal Communications Commission is also hoping that broadband providers will boost speeds to 100Mbps. Last week, FCC Chairman Julius Genachowski said he is challenging U.S. Internet service providers to offer 100Mbps to 100 million homes within the next 10 years. He is making this goal a part of the National Broadband Plan that will be presented to Congress next month.

The FCC has been encouraging “test beds” where these ultra-high-speed broadband networks can be tried out. Google has already announced it plans to launch experimental ultra-high-speed networks to test new applications. Rumors have also floated around that Cisco Systems, which makes many of the routers and switches that power the Internet, will make a big announcement on March 9 about a similar type of test bed.

Source

New Windows software turns one PC into many

Microsoft announced Wednesday that it is ready with Windows MultiPoint Server 2010, a product that lets schools run a classroom full of systems using just a single computer.

Based on Windows Server 2008 R2, MultiPoint allows up to 10 different set-ups, each with its own keyboard, mouse, and monitor, to run from a single server.

“We heard clearly from our customers in education that to help fulfill the amazing promise of technology in the classroom, they needed access to affordable computing that was easy to manage and use,” Microsoft vice president Anthony Salcito said in a statement.

Microsoft had said in November that it was working on the product.

NComputing, which already offers a similar approach using both Linux and standard versions of Windows, said it will incorporate MultiPoint Server across its product lineup.

Hewlett-Packard, ThinGlobal, Tritton and Wyse also plan to build products based on the software.

Source

Bloom box: An energy breakthrough?

In the world of energy, the Holy Grail is a power source that’s inexpensive and clean, with no emissions. Well over 100 start-ups in Silicon Valley are working on it, and one of them, Bloom Energy, is about to make public its invention: a little power-plant-in-a-box they want to put literally in your backyard.

You’ll generate your own electricity with the box and it’ll be wireless. The idea is to one day replace the big power plants and transmission line grid, the way the laptop moved in on the desktop and cell phones supplanted landlines.

It has a lot of smart people believing and buzzing, even though the company has been unusually secretive, until now.

K.R. Sridhar invited “60 Minutes” correspondent Lesley Stahl for a first look at the innards of the Bloom box that he has been toiling on for nearly a decade.

Source

Apple creates ‘explicit’ category for App Store software

Though it is not yet in use, Apple has added a category for developers to label their applications as “explicit” software in the App Store for the iPhone and iPod touch.

A developer revealed to Cult of Mac that the new category is available for selection on the iTunesConnect Web site. However, applications with the “explicit” distinction have not yet appeared in the App Store.

The change could signal that Apple is preparing to launch an adults-only section of the App Store that would segregate potentially offensive content from the remainder of applications.

The move follows Apple’s removal of more than 5,000 applications the company said were “overtly sexual.” The change in policy came after the company received numerous complaints from users who were concerned children would be able to access inappropriate content from the App Store on their iPhone or iPod touch. Whether the applications removed in the last week would be allowed back into the App Store under the new “explicit” category is unknown.

Apple is also preparing to launch its iPad device, a new form factor the company will pitch as a multimedia accessory that can serve as an e-reader of novels and textbooks. The new hardware will also have access to the App Store and its library of more than 140,000 applications. Its potential adoption in the education market could have played a part in Apple’s decision to remove sexual content.

Though Apple purged a number of applications (including some mistakenly), other adult-oriented content remained on the App Store, including applications from Playboy. Phil Schiller, head of worldwide product marketing for Apple, told The New York Times that his company had decided that well-known, established brands would be allowed to remain on the App Store.

Source

Toshiba Develops 1TB SSD That Fits On A Postage Stamp

SSDs are still overpriced for most average consumers, but the companies responsible for making them are constantly searching for ways to make them larger (in terms of capacity), smaller (in terms of form factor), and cheaper (in terms of real dollars). Toshiba has its own line of solid state drives right now, but just as the company has innovated in the optical storage department, it’s also hoping to innovate in the world of NAND storage.

A new partnership between the company and Tokyo’s Keio University has led to the creation of a new technology that could allow SSDs up to 1TB in size to be made “with a footprint no larger than a postage stamp.” That’s far, far smaller than even the 1.8″ drives that currently reside in the larger iPod units, and exponentially smaller than the 2.5″ SSDs that are shipping now for existing notebooks.

The report states that the two have been able to integrate 128GB NAND flash chips and a single controller into a stamp-sized form factor. They have even made it operational, with transfer rates of 2Gbps (or 250MB/sec) over short-range electromagnetic communication. Somehow, they even claim to have made it 70% more power-efficient than the average 2.5″ SSD, making it cheaper to operate as well. The company expects to be able to produce a proof-of-concept, application-ready version sometime in 2012.

The main issue right now is that there’s no industry standard in place for this type of technology, so it could be difficult to gain acceptance from PC makers and the like. Of course, devices will get smaller as time goes on, and we could easily see this being the go-to drive for the next generation of portable media players and possibly even netbooks. Unfortunately, there’s no mention of a consumer product release date just yet, but we’re guessing it’ll be a few years still.
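
For what it’s worth, the reported figures hang together arithmetically, as a quick check shows (our assumptions: binary units for capacity, 8 bits per byte for throughput):

```python
# Sanity check on the reported figures.
# Assumes binary units for capacity and 8 bits per byte for throughput.
chip_gb, drive_gb = 128, 1024                        # 128GB NAND chips, 1TB drive
print(drive_gb // chip_gb, "chips per 1TB drive")    # 8
print(2 * 1000 // 8, "MB/sec from a 2Gbps link")     # 250
```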

Regardless, it’s easy to see where the industry is going with solid state drive technologies. Eventually, with the level of resources behind its development, storage as we know it will transition completely over to the SSD, much as the vacuum tube gave way to the transistor so many years ago.

Source