
Do Supercomputers Lead To Downtime?

December 3, 2012
Filed under Computing

As supercomputers grow more powerful, they’ll also become more susceptible to failure, simply because they contain more components that can break. A few researchers at the recent SC12 conference, held last week in Salt Lake City, offered possible solutions to this growing problem.

Today’s high-performance computing (HPC) systems can have 100,000 nodes or more, with each node built from multiple components: memory, processors, buses and other circuitry. Statistically speaking, all of these components will fail at some point, and when they do, they halt operations, said David Fiala, a Ph.D. student at North Carolina State University, during a talk at SC12.

The problem is not a new one, of course. When Lawrence Livermore National Laboratory’s 600-node ASCI (Accelerated Strategic Computing Initiative) White supercomputer went online in 2001, it had a mean time between failures (MTBF) of only five hours, thanks in part to component failures. Later tuning efforts improved ASCI White’s MTBF to 55 hours, Fiala said.

But as the number of supercomputer nodes grows, so will the problem. “Something has to be done about this. It will get worse as we move to exascale,” Fiala said, referring to expectations that the supercomputers of the next decade will deliver 10 times the computational power of today’s models.

Today’s techniques for dealing with system failure may not scale very well, Fiala said. He cited checkpointing, in which a running program is temporarily halted and its state is saved to disk. Should the program then crash, the system is able to restart the job from the last checkpoint.
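The mechanics are easy to sketch. Below is a minimal, hypothetical illustration in Python (the file name, state layout and step counts are invented for the example, not drawn from any real HPC framework): the job saves its state periodically, and on restart it resumes from the newest checkpoint instead of starting over.

    import os
    import pickle

    CHECKPOINT = "job.ckpt"  # hypothetical checkpoint file

    def save_checkpoint(state):
        # Write atomically: dump to a temp file, then rename, so a
        # crash mid-write never corrupts the last good checkpoint.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "wb") as f:
            pickle.dump(state, f)
        os.replace(tmp, CHECKPOINT)

    def load_checkpoint():
        # Resume from the last saved state, or start fresh.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT, "rb") as f:
                return pickle.load(f)
        return {"step": 0, "result": 0}

    state = load_checkpoint()
    for step in range(state["step"], 1_000_000):
        state["result"] += step        # stand-in for the real work
        state["step"] = step + 1
        if step % 10_000 == 0:         # checkpoint every 10,000 steps
            save_checkpoint(state)

After a crash, rerunning the script picks up at the last multiple of 10,000 steps rather than at zero; the price is the time spent writing checkpoints, which is exactly the overhead that worries Fiala at scale.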

The problem with checkpointing, according to Fiala, is that as the number of nodes grows, the system overhead needed to perform it grows as well, and at an exponential rate. On a 100,000-node supercomputer, for example, only about 35 percent of the system’s activity will go toward actual computation; the rest will be consumed by checkpointing and, should a system fail, recovery operations, Fiala estimated.
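That trade-off can be made concrete with a back-of-the-envelope model. The sketch below uses Young’s classic approximation for the optimal checkpoint interval, roughly the square root of (2 × checkpoint cost × MTBF); the specific numbers are illustrative assumptions, not Fiala’s figures.

    import math

    def useful_fraction(mtbf_s, ckpt_cost_s):
        # Young's approximation for the optimal checkpoint interval.
        interval = math.sqrt(2 * ckpt_cost_s * mtbf_s)
        # Rough efficiency loss: time spent writing checkpoints plus
        # expected rework after a failure (about half an interval).
        overhead = ckpt_cost_s / interval + interval / (2 * mtbf_s)
        return 1 - overhead

    # Illustrative: a 5-minute checkpoint as system-wide MTBF shrinks.
    for mtbf_h in (24, 4, 1):
        print(f"MTBF {mtbf_h:2}h -> useful work {useful_fraction(mtbf_h * 3600, 300):.0%}")

With these assumed numbers, useful work falls from about 92% at a 24-hour MTBF to roughly 59% at a one-hour MTBF, the same direction as Fiala’s 35 percent estimate.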

Because of all the additional hardware needed for exascale systems, which could be built from a million or more components, system reliability will have to improve 100-fold just to maintain the MTBF that today’s supercomputers enjoy, Fiala said.
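The underlying arithmetic is straightforward: if components fail independently, system-wide MTBF is roughly the per-component MTBF divided by the component count, so every tenfold increase in components cuts MTBF tenfold unless the components themselves get more reliable. A quick illustration (the per-component figure is an assumption for the example):

    # System MTBF ~ component MTBF / component count, assuming
    # independent failures and identically reliable components.
    component_mtbf_h = 5_000_000  # assumed per-component MTBF in hours
    for n in (100_000, 1_000_000):
        print(f"{n:>9} components -> {component_mtbf_h / n:.0f} hours between failures")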

Fiala presented technology that he and fellow researchers developed that may help improve reliability. It addresses the problem of silent data corruption, in which systems make undetected errors while writing data to disk.
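The article does not spell out the researchers’ mechanism, but one common generic defense against silent corruption, offered here only as an illustrative sketch and not as their method, is to checksum data on the way to disk and verify it on the way back:

    import hashlib
    import pickle

    def write_verified(path, obj):
        # Store a checksum alongside the payload so corruption
        # introduced after the write can be detected on read.
        payload = pickle.dumps(obj)
        digest = hashlib.sha256(payload).hexdigest()
        with open(path, "wb") as f:
            pickle.dump((digest, payload), f)

    def read_verified(path):
        with open(path, "rb") as f:
            digest, payload = pickle.load(f)
        if hashlib.sha256(payload).hexdigest() != digest:
            raise IOError(f"silent data corruption detected in {path}")
        return pickle.loads(payload)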


SpyEye Poses Risk To Banking Defenses

August 1, 2011
Filed under Internet

Financial institutions are facing more trouble from SpyEye, a piece of malicious software that steals money from customers’ online bank accounts, according to new research from security vendor Trusteer.

SpyEye is a dastardly piece of malicious software: it can harvest credentials for online accounts and initiate transactions while a person is logged in, making it possible to watch a bank balance drop by the second.

In its latest versions, SpyEye has been modified with new code designed to evade the advanced systems banks have put in place to try to block fraudulent transactions, said Mickey Boodaei, Trusteer’s CEO.

Banks are now analyzing how a person uses their site, looking at parameters such as how many pages a person views, how long they spend on each page and how quickly they execute a transaction. Another indicator is the IP address: a person who normally logs in from the Miami area but suddenly logs in from St. Petersburg, Russia, will raise a flag.

SpyEye works fast: it can initiate a transaction automatically, far faster than an average person could manage manually on the website. That speed is a key trigger for banks to block a transaction, so SpyEye’s authors are now trying to mimic, in an automated way, how a real person would navigate a website.
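A toy version of that kind of behavioral screening is easy to sketch; the field names and thresholds below are invented for illustration, and real banking systems are considerably more sophisticated.

    # Hypothetical red flags for a single online-banking session.
    def risk_flags(session):
        flags = []
        if session["pages_viewed"] < 2:
            flags.append("too few pages viewed before transacting")
        if session["seconds_to_transaction"] < 5:
            flags.append("transaction faster than a human plausibly types")
        if session["login_region"] != session["usual_region"]:
            flags.append("login from an unusual region")
        return flags

    suspicious = {
        "pages_viewed": 1,
        "seconds_to_transaction": 2,
        "login_region": "St. Petersburg, RU",
        "usual_region": "Miami, US",
    }
    print(risk_flags(suspicious))  # all three flags fire

An automated SpyEye transaction trips the speed and navigation checks immediately, which is why its authors now try to make their scripts pause and browse the way a human would.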


80% Of Browsers Found To Be At Risk Of Attack

February 17, 2011
Filed under Internet

About eight out of every ten internet browsers run by consumers are vulnerable to attack by exploits of already-patched bugs, a security expert said today.

The poor state of browser patching stunned Wolfgang Kandek, CTO of security risk and compliance management provider Qualys, who presented data from the company’s free BrowserCheck service on Wednesday at the RSA Conference in San Francisco.

“I really thought it would be lower,” said Kandek of the nearly 80% of browsers that lacked one or more patches.

BrowserCheck scans Windows, Mac and Linux machines for vulnerable browsers, as well as up to 18 browser plug-ins, including Adobe’s Flash and Reader, Oracle’s Java and Microsoft’s Silverlight and Windows Media Player.

When browsers and their plug-ins are tabulated together, between 65% and 90% of all consumer systems scanned with BrowserCheck since June 2010 reported at least one out-of-date component, depending on the month. In January 2011, about 80% of the machines were vulnerable.
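At its core, a scan like this reduces to comparing installed component versions against the oldest fully patched release. Here is a minimal, hypothetical sketch; the version table is invented, and the article does not describe BrowserCheck’s actual detection logic.

    # Hypothetical minimum safe versions; real data would come from
    # a maintained vulnerability feed, not a hard-coded table.
    MIN_SAFE = {"Flash": (10, 2), "Java": (6, 24), "Silverlight": (4, 0)}

    def is_vulnerable(plugin, installed):
        # Flag any component older than the oldest patched release.
        return installed < MIN_SAFE[plugin]

    scanned = {"Flash": (10, 1), "Java": (6, 24), "Silverlight": (3, 0)}
    outdated = [name for name, ver in scanned.items() if is_vulnerable(name, ver)]
    print(outdated)  # ['Flash', 'Silverlight']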
