Graphene May Give Processors A Boost

June 28, 2016
Filed under Computing

Researchers at MIT have figured out that graphene, sheets of atom-thick carbon, could be used to make chips a million times faster.

The researchers have worked out that slowing the speed of light to the extent that it moves slower than flowing electrons can create an “optical boom”, the optical equivalent of a sonic boom.

Slowing the speed of light is no mean feat, but the clever folks at MIT managed it by using the honeycomb shape of carbon to slow photons to several hundredths of their normal speed in free space, explained researcher Ido Kaminer.

Meanwhile, the characteristics of graphene speed up electrons to a million metres a second, or around 1/300 of the speed of light in a vacuum.
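
As a quick sanity check, that ratio matches the textbook value for the speed of light in a vacuum, roughly 3 x 10^8 metres per second (a figure not quoted in the article):

\[
\frac{1 \times 10^{6}\ \text{m/s}}{3 \times 10^{8}\ \text{m/s}} = \frac{1}{300}
\]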

The optical boom is caused when electrons passing through the graphene catch up with the slowed light, effectively breaking the light barrier within the carbon honeycomb and causing a shockwave of light.

As the electrons move faster than the trapped light, they bleed plasmons, quasiparticles that represent the oscillation of electrons on the graphene’s surface.

Effectively, it is the equivalent of turning electricity into light. That in itself is nothing new – Thomas Edison managed it more than a century ago with the incandescent bulb – but this approach can efficiently and controllably generate plasmons at a scale that works with microchip technology.

The discovery could allow chip components to be made from graphene to enable the creation of light-based circuits. These circuits could be the next step in the evolution of chip and computing technology, as the transfer of data through light is far faster than using electrons in today’s chips, even the fast pixel-pushing ones.

So much faster that it’s “six orders of magnitude higher than what is used in electronics”, according to Kaminer. That’s up to a million times faster in plain English.

“There’s a lot of excitement about graphene because it could be easily integrated with other electronics,” said physics professor Marin Soljačić, a researcher on the project, who is confident that MIT can turn this theoretical experiment into a working system. “I have confidence that it should be doable within one to two years.”

This is a pretty big concept and almost sci-fi stuff, but we’re always keen to see smaller and faster chips. It also shows that the future tech envisioned by the world of sci-fi may not be that far away.

Courtesy-TheInq

Intel Looking Into Atomic Energy

May 25, 2016
Filed under Around The Net

Shortly after cancelling two generations of Atom mobile chips, Intel is putting its weight behind future low-power mobile technologies with a new research collaboration with a French atomic energy lab.

Fundamental research leading towards faster wireless networks, secure low-power technologies for the Internet of Things, and even 3D displays will be the focus of Intel’s collaboration with the French Alternative Energies and Atomic Energy Commission (CEA).

Intel and the CEA already work together in the field of high-performance computing, and a new agreement signed Thursday will see Intel fund work at the CEA’s Laboratory for Electronics and Information Technology (LETI) over the next five years, according to Rajeeb Hazra, vice president of Intel’s data center group.

The CEA was founded in 1945 to develop civil and military uses of nuclear power. Its work with Intel began soon after it ceased its atmospheric and underground nuclear weapons test programs, as it turned to computer modeling to continue its weapons research, CEA managing director Daniel Verwaerde said Thursday.

That effort continues, but the organization’s research interests today are more wide-ranging, encompassing materials science, climate, health, renewable energy, security and electronics.

These last two areas will be at the heart of the new research collaboration, which will see scientists at LETI exchanging information with those at Intel.

Both parties dodged questions about who will have the commercial rights to the fruits of their research, but each said it had protected its rights. The deal took a year to negotiate.

“It’s a balanced agreement,” said Stéphane Siebert, director of CEA Technology, the division of which LETI is a part.

Who owns what from the five-year research collaboration may become a thorny issue, for French taxpayers and Intel shareholders alike, as it will be many years before it becomes clear which technologies or patents are important.

Hazra emphasized the extent to which Intel is dependent on researchers outside the U.S. The company has over 50 laboratories in Europe, four of them specifically pursuing so-called exascale computing: systems capable of a billion billion (10^18) calculations per second.

Source-http://www.thegurureview.net/mobile-category/intel-look-to-atomic-energy-for-mobile-technologys-future.html

IBM’s Watson Goes Cybersecurity

May 23, 2016
Filed under Computing

IBM Security has announced a new year-long research project through which it will partner with eight universities to help train its Watson artificial intelligence system to tackle cybercrime.

Knowledge about threats is often hidden in unstructured sources such as blogs, research reports and documentation, said Kevin Skapinetz, director of strategy for IBM Security.

“Let’s say tomorrow there’s an article about a new type of malware, then a bunch of follow-up blogs,” Skapinetz explained. “Essentially what we’re doing is training Watson not just to understand that those documents exist, but to add context and make connections between them.”

Over the past year, IBM Security’s own experts have been working to teach Watson the “language of cybersecurity,” he said. That’s been accomplished largely by feeding it thousands of documents annotated to help the system understand what a threat is, what it does and what indicators are related, for example.

“You go through the process of annotating documents not just for nouns and verbs, but also what it all means together,” Skapinetz said. “Then Watson can start making associations.”
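
As a rough illustration of that kind of markup (the schema, field names and sample text below are hypothetical and invented for this example, not IBM’s actual training format), an annotated threat document might pair the raw text with the entities it mentions and the relationships between them:

```python
# Hypothetical annotated threat document, loosely illustrating the kind of
# entity-and-relationship markup described above. The schema and values are
# invented for illustration; this is not IBM's actual annotation format.
annotated_doc = {
    "source": "example-security-blog.com/posts/new-malware",  # made-up URL
    "text": "The new Locky variant spreads via phishing emails carrying "
            "malicious Word macros and contacts its C2 server over HTTPS.",
    "entities": [
        {"span": "Locky variant",   "type": "malware"},
        {"span": "phishing emails", "type": "delivery_mechanism"},
        {"span": "Word macros",     "type": "infection_vector"},
        {"span": "C2 server",       "type": "attacker_infrastructure"},
    ],
    # The relations capture "what it all means together", not just the nouns.
    "relations": [
        ("Locky variant", "delivered_by",      "phishing emails"),
        ("Locky variant", "executes_via",      "Word macros"),
        ("Locky variant", "communicates_with", "C2 server"),
    ],
}

# A trained system could then surface these associations directly.
for subject, predicate, obj in annotated_doc["relations"]:
    print(f"{subject} --{predicate}--> {obj}")
```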

Now IBM aims to accelerate the training process. This fall, it will begin working with students at universities including California State Polytechnic University at Pomona, Penn State, MIT, New York University and the University of Maryland at Baltimore County along with Canada’s universities of New Brunswick, Ottawa and Waterloo.

Over the course of a year, the program aims to feed up to 15,000 new documents into Watson every month, including threat intelligence reports, cybercrime strategies, threat databases and materials from IBM’s own X-Force research library. X-Force represents 20 years of security research, including details on 8 million spam and phishing attacks and more than 100,000 documented vulnerabilities.

Watson’s natural language processing capabilities will help it make sense of those reams of unstructured data. Its data-mining techniques will help detect outliers, and its graphical presentation tools will help find connections among related data points in different documents, IBM said.
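
IBM has not said which algorithms Watson for Cyber Security will use, but as a generic sketch of what outlier detection on threat data can look like, a standard technique such as scikit-learn’s IsolationForest flags data points whose features look unusual (the feature values below are invented for illustration; this is not IBM’s method):

```python
# Generic outlier-detection sketch using scikit-learn's IsolationForest.
# It only illustrates the idea of flagging unusual indicators in a batch of
# data; it is not IBM's implementation, and all numbers are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row is a hypothetical indicator:
# [requests_per_minute, payload_size_kb, distinct_ports_contacted]
indicators = np.array([
    [12, 4, 1], [15, 5, 1], [11, 4, 2], [14, 6, 1],  # typical-looking traffic
    [900, 250, 45],                                  # an obvious outlier
    [13, 5, 1], [16, 4, 2],
])

model = IsolationForest(contamination=0.15, random_state=0).fit(indicators)
labels = model.predict(indicators)  # +1 = inlier, -1 = outlier

for row, label in zip(indicators, labels):
    if label == -1:
        print("Possible anomaly:", row)
```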

Ultimately, the result will be a cloud service called Watson for Cyber Security that’s designed to provide insights into emerging threats as well as recommendations on how to stop them.

Source-http://www.thegurureview.net/computing-category/ibms-watson-to-get-schooled-on-cybersecurity.html

China Keeps Supercomputing Title

July 24, 2015
Filed under Computing

A supercomputer developed by China’s National University of Defense Technology is still the fastest publicly known computer in the world, while the U.S. is close to a historic low in the latest edition of the closely followed Top 500 supercomputer ranking, which has just been published.

The Tianhe-2 computer, based at the National Supercomputer Center in Guangzhou, has been at the top of the list for more than two years, and its maximum achieved performance of 33,863 teraflops is almost double that of the U.S. Department of Energy’s Cray Titan supercomputer at Oak Ridge National Laboratory in Tennessee.
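
The “almost double” comparison is consistent with Titan’s benchmark score of roughly 17,590 teraflops on the same list (a figure not quoted in this article):

\[
\frac{33{,}863\ \text{teraflops (Tianhe-2)}}{17{,}590\ \text{teraflops (Titan)}} \approx 1.9
\]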

The IBM Sequoia computer at the Lawrence Livermore National Laboratory in California is the third fastest machine, and fourth on the list is the Fujitsu K computer at Japan’s Advanced Institute for Computational Science. The only new machine to enter the top 10 is the Shaheen II computer of King Abdullah University of Science and Technology in Saudi Arabia, which is ranked seventh.

The Top 500 list, published twice a year to coincide with supercomputer conferences, is closely watched as an indicator of the status of development and investment in high-performance computing around the world. It also provides insights into what technologies are popular among organizations building these machines, but participation is voluntary. It’s quite possible a number of secret supercomputers exist that are not counted in the list.

With 231 machines on the Top 500 list, the U.S. remains the top country in terms of the number of supercomputers, but that is close to the all-time low of 226 hit in mid-2002, right about the time China began appearing on the list. China rose to 76 machines this time last year, but the latest count puts it at 37 systems.

The Top 500 list is compiled by supercomputing experts at the University of Mannheim, Germany; the University of Tennessee, Knoxville; and the Department of Energy’s Lawrence Berkeley National Laboratory.

Source

Google Continues A.I. Expansion

November 4, 2014
Filed under Computing

Google Inc is expanding its artificial intelligence work, hiring more than half a dozen leading academics and experts in the field and announcing a partnership with Oxford University to “accelerate” its efforts.

Google will make a “substantial contribution” to establish a research partnership with Oxford’s computer science and engineering departments, the company said on Thursday regarding its work to develop the intelligence of machines and software, often to emulate human-like intelligence.

Google did not provide any financial details about the partnership, saying only in a post on its blog that it will include a program of student internships and a series of joint lectures and workshops “to share knowledge and expertise.”

Google, which is based in Mountain View, California, is building up its artificial intelligence capabilities as it strives to maintain its dominance in the Internet search market and to develop new products such as robotics and self-driving cars. In January, Google acquired artificial intelligence company DeepMind for $400 million, according to media reports.

The new hires will be joining Google’s DeepMind team, and they include three artificial intelligence experts whose work has focused on improving computer visual recognition systems. Among them is Oxford Professor Andrew Zisserman, a three-time winner of the Marr Prize for computer vision.

The four founders of Dark Blue Labs will also be joining Google, where they will lead efforts to help machines “better understand what users are saying to them.”

Google said that three of the professors will hold joint appointments at Oxford, continuing to work part time at the university.

Source

Dell Goes Plastic

June 3, 2014
Filed under Computing

Dell is manufacturing a line of PCs using plastics recovered through its expanded recycling program.

The company has expanded the hardware take-back program to more places worldwide, aiming to collect and reuse more extracted plastic and metals in PCs, monitors, hardware panels and other products.

Dell’s OptiPlex 3030 all-in-one, which will ship next month, will be the first product of that effort. Starting next year, more laptops, desktops and monitor back-panels will be made using recycled plastic, said Scott O’Connell, director of environmental affairs at Dell. The products will be certified as sustainable by UL (Underwriters Laboratories).

Dell will save money by reusing plastic, but O’Connell did not say whether the savings will be passed on to customers through lower prices. He added that it will become easier for more people to recycle electronics, and that Dell will also provide a PC mail-back option.

Dell’s plan to establish a recycling chain internally could reduce the need for “virgin” plastics, which can be environmentally damaging to make, said Gary Cook, senior IT analyst at Greenpeace International.

Incineration of plastic from disposed computers can be toxic and reusing plastics in new computers or other parts reduces “dirty energy,” Cook said.

“We need to see plastics last longer,” Cook said.

Companies like Apple have helped raise expectations of sustainability in computers and others are following suit, Cook said. PC makers are using more metals in computer chassis and handset makers are using more nonpetroleum plastics.

Dell was criticized last year by Greenpeace for veering away from its carbon-neutral goals and sustainability advocacy. The company ranked 14th among the greenest IT companies, behind Microsoft, IBM, Hewlett-Packard, Wipro, Fujitsu and Google, among others.

Dell curbed its sustainability strategy when it was trying to go private last year, but has now reinvigorated that effort.

“They are trying to show some initiative,” Cook said.

Source

App Stores For Supercomputers En Route

December 13, 2013
Filed under Computing

A major problem facing supercomputing is that the firms that could benefit most from the technology aren’t using it. It is a dilemma.

Supercomputer-based visualization and simulation tools could allow a company to create, test and prototype products in virtual environments. Couple this virtualization capability with a 3-D printer, and a company would revolutionize its manufacturing.

But licensing fees for the software needed to simulate wind tunnels, ovens, welds and other processes are expensive, and the tools require large multicore systems and skilled engineers to use them.

One possible solution: taking an HPC process and converting it into an app.

This is how it might work: a manufacturer designing a part to reduce drag on an 18-wheel truck could upload a CAD file, plug in some parameters, hit start and let the job run on 128 cores of the Ohio Supercomputer Center’s (OSC) 8,500-core system. The cost would likely be anywhere from $200 to $500 for a 6,000 CPU-hour run, or about 48 hours of wall-clock time, to simulate the process and package the results up in a report.
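
The roughly 48-hour figure and the implied price per CPU-hour follow directly from the numbers in that example; here is a minimal sketch of the arithmetic, using only the figures quoted above:

```python
# Back-of-the-envelope check of the AweSim pricing example above.
# All input figures come from the article; this is not an official rate card.
cpu_hours = 6000                  # total CPU-hours for the simulation run
cores = 128                       # cores allocated on OSC's 8,500-core system
cost_low, cost_high = 200, 500    # quoted price range in dollars

wall_clock_hours = cpu_hours / cores
print(f"Wall-clock time: ~{wall_clock_hours:.0f} hours")  # ~47, i.e. "about 48 hours"

rate_low, rate_high = cost_low / cpu_hours, cost_high / cpu_hours
print(f"Effective rate: ${rate_low:.3f} to ${rate_high:.3f} per CPU-hour")  # ~$0.033 to ~$0.083
```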

Testing that 18-wheeler in a physical wind tunnel could cost as much as $100,000.

Alan Chalker, the director of the OSC’s AweSim program, uses that example to explain what his organization is trying to do. The new group has some $6.5 million from government and private groups, including consumer products giant Procter & Gamble, to find ways to bring HPC to manufacturers via an app store.

The app store is slated to open at the end of the first quarter of next year with one app and several tools that have been ported for the Web. The plan is to eventually spin off AweSim into a private firm and populate the app store with thousands of apps.

Tom Lange, director of modeling and simulation in P&G’s corporate R&D group, said he hopes that AweSim’s tools will be used for the company’s supply chain.

The software industry model is based on selling licenses, which for an HPC application can cost $50,000 a year, said Lange. That price is well out of the reach of small manufacturers interested in fixing just one problem. “What they really want is an app,” he said.

Lange said P&G has worked with supply chain partners on HPC issues, but it can be difficult because of the complexities of the relationship.

“The small supplier doesn’t want to be beholden to P&G,” said Lange. “They have an independent business and they want to be independent and they should be.”

That’s one of the reasons he likes AweSim.

AweSim will use some open source HPC tools in its apps and is also working on agreements with major HPC software vendors to make parts of their tools available through an app.

Chalker said software vendors are interested in working with AweSim because it’s a way to get to a market that’s inaccessible today. The vendors could get some licensing fees for an app and a potential customer for larger, more expensive apps in the future.

AweSim is an outgrowth of the Blue Collar Computing initiative that started at OSC in the mid-2000s with goals similar to AweSim’s. But that program required that users purchase a lot of costly consulting work. The app store’s approach is to minimize cost, and the need for consulting help, as much as possible.

Chalker has a half dozen apps already built, including one used in the truck example. The OSC is building a software development kit to make it possible for others to build them as well. One goal is to eventually enable other supercomputing centers to provide compute capacity for the apps.

AweSim will charge users a fixed rate for CPUs, covering just the costs, and will provide consulting expertise where it is needed. Consulting fees may raise the bill for users, but Chalker said it usually wouldn’t be more than a few thousand dollars, a lot less than hiring a full-time computer scientist.

The AweSim team expects that many app users, a mechanical engineer for instance, will know enough to work with an app without the help of a computational fluid dynamics expert.

Lange says that manufacturers understand that producing domestically rather than overseas requires making products better, being innovative and not wasting resources. “You have to be committed to innovate what you make, and you have to commit to innovating how you make it,” said Lange, who sees HPC as a path to get there.

Source

Researchers Build Flying Robot

December 4, 2013
Filed under Around The Net

Researchers say they have assembled a flying robot. It’s not designed to fly like a bird or an insect, but was built to simulate the movements of a swimming jellyfish.

Scientists at New York University say they built the small, flying vehicle to move like the boneless, pulsating, water-dwelling jellyfish.

Leif Ristroph, a post-doctoral student at NYU and a lead researcher on the project, explained that previous flying robots were based on the flight of birds or insects, such as flies.

Last spring, for example, Harvard University researchers announced that they had built an insect-like robot that flies by flapping its wings. The flying robot is so small it has about 1/30th the weight of a U.S. penny.
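
Assuming the standard 2.5 gram mass of a modern U.S. penny (a figure not given in the article), that puts the Harvard robot at roughly 80 milligrams:

\[
\frac{2.5\ \text{g}}{30} \approx 0.083\ \text{g} \approx 83\ \text{mg}
\]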

Before the Harvard work was announced, researchers at the University of Sheffield and the University of Sussex in England worked together to study the brains of honey bees in an attempt to build an autonomous flying robot.

By creating models of the systems in a bee’s brain that control vision and sense of smell, scientists hope to build a robot that would be able to sense and act as autonomously as a bee.

The problem with those designs, though, is that the flapping wing of a fly is inherently unstable, Ristroph noted.

“To stay in flight and to maneuver, a fly must constantly monitor its environment to sense every gust of wind or approaching predator, adjusting its flying motion to respond within fractions of a second,” Ristroph said. “To recreate that sort of complex control in a mechanical device — and to squeeze it into a small robotic frame — is extremely difficult.”

To get beyond those challenges, Ristroph built a prototype robot that is 8 centimeters wide and weighs two grams. The robot flies by flapping four wings, arranged like the petals of a flower, which pulsate up and down in a motion resembling that of a moth.

The machine, according to NYU, can hover and fly in a particular direction.

There is more work still to be done. Ristroph reported that his prototype doesn’t have a battery but is attached to an external power source. It also can’t steer, either autonomously or via remote control.

Source

IBM’s Next-gen Transistors Mimic Human Brain

April 17, 2013
Filed under Computing

IBM has discovered a way to make transistors that could be turned into virtual circuitry that mimics how the human brain operates.

The new transistors would be made from strongly correlated materials, such as metal oxides, which researchers say can be used to build more powerful — but less power-hungry — computation circuitry.

“The scaling of conventional silicon-based transistors is nearing an end, after a fantastic run of 50 years,” said Stuart Parkin, an IBM fellow at IBM Research. “We need to consider alternative devices and materials that operate entirely differently.”

Researchers have been trying to find ways of changing conductivity states in strongly correlated materials for years. Parkin’s team is the first to convert metal oxides from an insulated to conductive state by applying oxygen ions to the material. The team recently published details of the work in the journal Science.

In theory, such transistors could mimic how the human brain operates in that “liquids and currents of ions [would be used] to change materials,” Parkin said, noting that “brains can carry out computing operations a million times more efficiently than silicon-based computers.”

Source