Syber Group

IBM’s Watson Goes To Africa

February 20, 2014 by  
Filed under Computing


IBM has detailed plans to apply its Watson supercomputer to the critical development issues facing Africa.

The machine is capable of holding more intelligent conversations than most Big Brother contestants, and in 2011 it beat human contestants on the US TV game show Jeopardy.

However, in Africa it will be used to help tackle pressing problems facing the continent, such as agricultural planning and famine relief.

The initiative, named Project Lucy after the earliest human remains discovered on the continent, will take 10 years and is expected to cost $100m.

“I believe it will spur a whole era of innovation for entrepreneurs here,” IBM CEO Ginni Rometty told delegates at a conference on Wednesday.

“Data… needs to be refined. It will determine undisputed winners and losers across every industry.”

The technology will be used to find ways to enable the developing world to leapfrog over stages of development that have hitherto been too expensive.

One example cited was Nigeria, where two companies have already committed to use Project Lucy to analyse the poorly maintained road system and determine project priorities for repair.

IBM recently announced that it will invest $1bn to spin off Watson into a separate business unit. This could be quite a gamble, however: Reuters reported that although Watson has proved to be a quantum leap, it has yet to make any significant money for the company, netting less than $100m in the past three years.

Source

App Stores For Supercomputers En Route

December 13, 2013 by  
Filed under Computing


A major problem facing supercomputing is that the firms that could benefit most from the technology aren’t using it. It is a dilemma.

Supercomputer-based visualization and simulation tools could allow a company to create, test and prototype products in virtual environments. Couple this virtualization capability with a 3-D printer, and a company would revolutionize its manufacturing.

But licensing fees for the software needed to simulate wind tunnels, ovens, welds and other processes are expensive, and the tools require large multicore systems and skilled engineers to use them.

One possible solution: taking an HPC process and converting it into an app.

This is how it might work: a manufacturer designing a part to reduce drag on an 18-wheel truck could upload a CAD file, plug in some parameters, hit start and let the simulation run on 128 cores of the Ohio Supercomputer Center’s (OSC) 8,500-core system. The cost would likely be anywhere from $200 to $500 for a 6,000-CPU-hour run, or about 48 hours, to simulate the process and package the results up in a report.

Testing that 18-wheeler in a physical wind tunnel could cost as much as $100,000.
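The arithmetic behind that truck example is simple to sketch. The per-CPU-hour rate below is an assumption chosen so that 6,000 CPU-hours lands in the quoted $200 to $500 range; it is not a published OSC price.

```python
# Rough cost and wall-clock estimate for the hypothetical truck-drag run
# described above. The $0.05/CPU-hour rate is an illustrative assumption,
# not a published OSC price.

def estimate_run(cpu_hours: float, cores: int, rate_per_cpu_hour: float) -> tuple[float, float]:
    """Return (cost in dollars, wall-clock hours) for an HPC job."""
    cost = cpu_hours * rate_per_cpu_hour          # total CPU-hours billed at a flat rate
    wall_clock = cpu_hours / cores                # CPU-hours spread across the cores used
    return cost, wall_clock

cost, hours = estimate_run(cpu_hours=6000, cores=128, rate_per_cpu_hour=0.05)
print(f"~${cost:.0f} over ~{hours:.0f} hours")  # ~$300 over ~47 hours
```

At that assumed rate the run costs about $300 and takes roughly two days of wall-clock time, which is consistent with the figures quoted above.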

Alan Chalker, the director of the OSC’s AweSim program, uses that example to explain what his organization is trying to do. The new group has some $6.5 million from government and private groups, including consumer products giant Procter & Gamble, to find ways to bring HPC to manufacturers via an app store.

The app store is slated to open at the end of the first quarter of next year, with one app and several tools that have been ported for the Web. The plan is to eventually spin off AweSim into a private firm and populate the app store with thousands of apps.

Tom Lange, director of modeling and simulation in P&G’s corporate R&D group, said he hopes that AweSim’s tools will be used for the company’s supply chain.

The software industry model is based on selling licenses, which for an HPC application can cost $50,000 a year, said Lange. That price is well out of the reach of small manufacturers interested in fixing just one problem. “What they really want is an app,” he said.

Lange said P&G has worked with supply chain partners on HPC issues, but it can be difficult because of the complexities of the relationship.

“The small supplier doesn’t want to be beholden to P&G,” said Lange. “They have an independent business and they want to be independent and they should be.”

That’s one of the reasons he likes AweSim.

AweSim will use some open source HPC tools in its apps, and is also working on agreements with major HPC software vendors to make parts of their tools available through an app.

Chalker said software vendors are interested in working with AweSim because it’s a way to get to a market that’s inaccessible today. The vendors could get some licensing fees for an app and a potential customer for larger, more expensive apps in the future.

AweSim is an outgrowth of the Blue Collar Computing initiative that started at OSC in the mid-2000s with goals similar to AweSim’s. But that program required that users purchase a lot of costly consulting work. The app store’s approach is to minimize cost, and the need for consulting help, as much as possible.

Chalker has a half dozen apps already built, including one used in the truck example. The OSC is building a software development kit to make it possible for others to build them as well. One goal is to eventually enable other supercomputing centers to provide compute capacity for the apps.

AweSim will charge users a fixed rate for CPUs, covering just the costs, and will provide consulting expertise where it is needed. Consulting fees may raise the bill for users, but Chalker said it usually wouldn’t be more than a few thousand dollars, a lot less than hiring a full-time computer scientist.

The AweSim team expects that many app users, a mechanical engineer for instance, will know enough to work with an app without the help of a computational fluid dynamics expert.

Lange says that manufacturers understand that producing domestically rather than overseas requires making products better, being innovative and not wasting resources. “You have to be committed to innovate what you make, and you have to commit to innovating how you make it,” said Lange, who sees HPC as a path to get there.

Source

Will Computers Obtain Common Sense?

December 10, 2013 by  
Filed under Computing


Even though it may appear computers are being dumbed down by the constant stream of images of cats playing the piano or dogs playing in the snow, one computer is looking at those same images and getting smarter and smarter.

A computer cluster at Carnegie Mellon University running the so-called Never Ending Image Learner operates 24 hours a day, seven days a week, searching the Internet for images, studying them on its own and building a visual database. The process, scientists say, is giving the computer an increasing amount of common sense.

“Images are the best way to learn visual properties,” said Abhinav Gupta, assistant research professor in Carnegie Mellon’s Robotics Institute. “Images also include a lot of common sense information about the world. People learn this by themselves and, with [this program], we hope that computers will do so as well.”

The computers have been running the program since late July, analyzing some three million images. The system has identified 1,500 types of objects in half a million images and 1,200 types of scenes in hundreds of thousands of images, according to the university.

The program has connected the dots to learn 2,500 associations from thousands of instances.

Thanks to advances in computer vision that enable software to identify and label objects found in images and recognize colors, materials and positioning, the Carnegie Mellon cluster is better understanding the visual world with each image it analyzes.

The program also is set up to enable a computer to make common sense associations, like buildings are vertical instead of lying on their sides, people eat food, and cars are found on roads. All the things that people take for granted, the computers now are learning without being told.
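The association-mining idea described above can be illustrated with a toy co-occurrence counter. This is a conceptual sketch only, not the Never Ending Image Learner's actual code, and the image labels are made up.

```python
from collections import Counter
from itertools import combinations

# Toy sketch of mining "common sense" associations from labeled images:
# count how often two labels appear in the same image, and keep pairs that
# co-occur often enough to suggest a real-world relationship (e.g. cars are
# found on roads, people eat food). Labels are invented for illustration.
labeled_images = [
    {"car", "road"},
    {"car", "road", "building"},
    {"person", "food"},
    {"person", "food", "table"},
    {"car", "road", "person"},
]

pair_counts = Counter()
for labels in labeled_images:
    for pair in combinations(sorted(labels), 2):
        pair_counts[pair] += 1

# Pairs seen in at least two images become candidate associations.
associations = {pair for pair, n in pair_counts.items() if n >= 2}
print(associations)
```

Here ("car", "road") and ("food", "person") survive the threshold while one-off pairings are discarded; the real system works at vastly larger scale, with labels produced by computer vision rather than supplied by hand.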

“People don’t always know how or what to teach computers,” said Abhinav Shrivastava, a robotics Ph.D. student at CMU and a lead researcher on the program. “But humans are good at telling computers when they are wrong.”

He noted, for instance, that a human might need to tell the computer that pink isn’t just the name of a singer but also is the name of a color.

While computer scientists have previously tried to “teach” computers about different real-world associations, compiling structured data for them, the job has always been far too vast to tackle successfully. CMU noted that Facebook alone has more than 200 billion images.

The only way for computers to scan enough images to understand the visual world is to let them do it on their own.

“What we have learned in the last five to 10 years of computer vision research is that the more data you have, the better computer vision becomes,” Gupta said.

CMU’s computer learning program is supported by Google and the Office of Naval Research.

Source

Researchers Build Flying Robot

December 4, 2013 by  
Filed under Around The Net


Researchers say they have assembled a flying robot. It’s not designed to fly like a bird or an insect, but was built to simulate the movements of a swimming jellyfish.

Scientists at New York University say they built the small, flying vehicle to move like the boneless, pulsating, water-dwelling jellyfish.

Leif Ristroph, a post-doctoral student at NYU and a lead researcher on the project, explained that previous flying robots were based on the flight of birds or insects, such as flies.

Last spring, for example, Harvard University researchers announced that they had built an insect-like robot that flies by flapping its wings. The flying robot is so small that it weighs about 1/30th as much as a U.S. penny.

Before the Harvard work was announced, researchers at the University of Sheffield and the University of Sussex in England worked together to study the brains of honey bees in an attempt to build an autonomous flying robot.

By creating models of the systems in a bee’s brain that control vision and sense of smell, scientists hope to build a robot that would be able to sense and act as autonomously as a bee.

The problem with those designs, though, is that the flapping wing of a fly is inherently unstable, Ristroph noted.

“To stay in flight and to maneuver, a fly must constantly monitor its environment to sense every gust of wind or approaching predator, adjusting its flying motion to respond within fractions of a second,” Ristroph said. “To recreate that sort of complex control in a mechanical device — and to squeeze it into a small robotic frame — is extremely difficult.”

To get beyond those challenges, Ristroph built a prototype robot that is 8 centimeters wide and weighs two grams. The robot flies by flapping four wings arranged like petals on a flower that pulsate up and down, resembling the flying motion of a moth.

The machine, according to NYU, can hover and fly in a particular direction.

There is more work still to be done. Ristroph reported that his prototype doesn’t have a battery but is attached to an external power source. It also can’t steer, either autonomously or via remote control.

Source

Can Robots Run On (NH2)2CO?

November 19, 2013 by  
Filed under Around The Net


Scientists have discovered a way to power future robots using an unusual source — urine.

Researchers at the University of the West of England, Bristol and the University of Bristol collaborated to build a system that will enable robots to function without batteries or being plugged into an electrical outlet.

Based on the functioning of the human heart, the system is designed to pump urine into the robot’s “engine room,” converting the waste into electricity and enabling the robot to function completely on its own.

Scientists are hoping the system, which can hold 24.5 ml of urine, could be used to power future generations of robots, or what they’re calling EcoBots.

“In the city environment, they could re-charge using urine from urinals in public lavatories,” said Peter Walters, a researcher with the University of the West of England. “In rural environments, liquid waste effluent could be collected from farms.”

In the past 10 years, researchers have built four generations of EcoBots, each able to use microorganisms to digest the waste material and generate electricity from it, the university said.

Along with using human and animal urine, the robotic system also can create power by using rotten fruit and vegetables, dead flies, waste water and sludge.

Ioannis Ieropoulos, a scientist with the Bristol Robotics Laboratory, explained that the microorganisms work inside microbial fuel cells where they metabolize the organics, converting them into carbon dioxide and electricity.

Like the human heart, the robotic system works by using artificial muscles that compress a soft area in the center of the device, forcing fluid to be expelled through an outlet and delivered to the fuel cells. The artificial muscles then relax and go through the process again for the next cycle.

“The artificial heartbeat is mechanically simpler than a conventional electric motor-driven pump by virtue of the fact that it employs artificial muscle fibers to create the pumping action, rather than an electric motor, which is by comparison a more complex mechanical assembly,” Walters said.

Source

Stanford Develops Carbon Nanotubes

October 17, 2013 by  
Filed under Computing


Researchers at Stanford University have demonstrated the first functional computer constructed using only carbon nanotube transistors.

Scientists have been experimenting with transistors based on carbon nanotubes, or CNTs, as substitutes for silicon transistors, which may soon hit their physical limits.

The rudimentary CNT computer is said to run a simple operating system capable of multitasking, according to a synopsis of an article published in the journal Nature.

Made of 178 transistors, each containing between 10 and 200 carbon nanotubes, the computer can do four tasks summarized as instruction fetch, data fetch, arithmetic operation and write-back, and run two different programs concurrently.
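The four-stage cycle and two-program multitasking described above can be sketched as a toy round-robin interpreter. This is a conceptual illustration of the named stages only, not the Stanford team's design, and the instruction format is invented.

```python
# Toy illustration of the four stages named above (instruction fetch, data
# fetch, arithmetic, write-back), alternating round-robin between two
# programs -- a conceptual sketch, not the actual CNT computer's design.
def run(programs):
    """Each program is a list of (op, src_a, src_b, dest) over a register file."""
    regs = [{"a": 2, "b": 3, "out": 0}, {"a": 10, "b": 4, "out": 0}]
    pcs = [0, 0]  # one program counter per program
    while any(pc < len(p) for pc, p in zip(pcs, programs)):
        for i, prog in enumerate(programs):            # round-robin multitasking
            if pcs[i] >= len(prog):
                continue
            op, a, b, dest = prog[pcs[i]]              # 1. instruction fetch
            x, y = regs[i][a], regs[i][b]              # 2. data fetch
            result = x + y if op == "add" else x - y   # 3. arithmetic operation
            regs[i][dest] = result                     # 4. write-back
            pcs[i] += 1
    return regs

final = run([[("add", "a", "b", "out")], [("sub", "a", "b", "out")]])
print(final)  # program 0 computes 2+3=5, program 1 computes 10-4=6
```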

The research team was led by Stanford professors Subhasish Mitra and H.S. Philip Wong.

“People have been talking about a new era of carbon nanotube electronics moving beyond silicon,” Mitra said in a statement. “But there have been few demonstrations of complete digital systems using [the] technology. Here is the proof.”

IBM last October said its scientists had placed more than 10,000 transistors made of nano-size tubes of carbon on a single chip. Previous efforts had yielded chips with just a few hundred carbon nanotubes.

Source

IBM Goes Linux

September 27, 2013 by  
Filed under Computing


IBM reportedly will invest $1bn in Linux and other open source technologies for its Power system servers.

The firm is expected to announce the news at the Linuxcon 2013 conference in New Orleans, pledging to spend $1bn over five years on Linux and related open source technologies.

The software technology will be used on IBM’s Power line of servers, which are based on the chip technology of the same name and used for running large scale systems in data centres.

Previously IBM Power systems have mostly run IBM’s proprietary AIX version of Unix, though some used in high performance computing (HPC) configurations have run Linux.

If true, this will be the second time IBM has coughed up a $1bn investment in Linux. IBM gave the open source OS the same vote of confidence around 13 years ago.

According to the Wall Street Journal, IBM isn’t investing in Linux to convert its existing AIX customers, but instead Linux will help support data centre applications driving big data, cloud computing and analytics.

“We continue to take share in Unix, but it’s just not growing as fast as Linux,” said IBM VP of Power development Brad McCredie.

The $1bn is expected to go mainly toward facilities and staffing to help Power system users move to Linux, with a new centre being opened in France especially to help manage that transition.

Full details are planned to be announced at Linuxcon later today.

Last month, IBM swallowed Israeli security firm Trusteer to boost its customers’ cyber defences with the company’s anti-hacking technology.

Announcing that it had signed a definitive agreement with Trusteer to create a security lab in Israel, IBM said it planned to focus on mobile and application security, counter-fraud and malware detection staffed by 200 Trusteer and IBM researchers.

Source

MIT Develops Inflatable Antenna

September 17, 2013 by  
Filed under Around The Net


Satellites the size of shoe boxes, which are expected to one day allow researchers to explore space more efficiently, will soon have greater range.

MIT researchers have built and tested an inflatable antenna that can fold into such a satellite, then inflate in orbit to enable long-range communications from up to seven times the distance possible today.

The technology will let the small satellites, called CubeSats, move further into space and send valuable information to scientists back on Earth.

“With this antenna, you could transmit from the moon, and even farther than that,” said Alessandra Babuscia, a researcher on the inflatable antenna team at MIT, in a statement. “This antenna is one of the cheapest and most economical solutions to the problem of communication. But all this research builds a set of options to allow the spacecraft … to fly in deep space.”

The MIT effort comes as engineers at the University of Michigan work on ways to propel such small spacecraft into interplanetary space. The team is building a plasma thruster that could fit in a 10-centimeter space and push a small satellite-bearing spacecraft into deep space.

The university researchers are using superheated plasma that would be pushed through a magnetic field to propel a CubeSat.

The MIT researchers are seeking to solve the communications problems and enable far-afield CubeSats to send data to and receive instructions from Earth.

The CubeSat devices cannot support radio dishes that are used today to let spacecraft communicate when far from Earth’s orbit.

The inflatable antenna significantly amplifies radio signals, allowing a CubeSat to transmit data back to Earth at a higher rate, according to the university.

MIT engineers have built two prototype antennae, each a meter wide, out of Mylar, which is a polyester film known for its strength and use as an electric insulator. One antenna was a cone shape, while the other looks more like a cylinder when inflated. Each fits into a 10-cubic-centimeter space within a CubeSat.

Each prototype contains a few grams of benzoic acid, which can be converted to a gas to inflate the antenna, MIT noted.

In testing, the cylindrical antenna performed “slightly better” than the cone-shaped device, transmitting data 10 times faster and seven times farther than existing CubeSat antennae.
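A rough rule of thumb from the free-space link budget helps make sense of a range claim like that: received power falls with the square of distance, so an antenna gain improvement of g lets a transmitter reach about sqrt(g) times farther at the same data rate. The numbers below are illustrative assumptions, not MIT's published measurements.

```python
import math

# Free-space rule of thumb: received power scales as gain / distance**2,
# so at a fixed data rate, achievable range scales as sqrt(gain ratio).
# The 49x gain ratio below is an assumed figure chosen to show how a 7x
# range extension could arise; it is not a published MIT measurement.
def range_multiplier(gain_ratio: float) -> float:
    """How many times farther a link reaches for a given gain improvement."""
    return math.sqrt(gain_ratio)

print(range_multiplier(49))  # 7.0
```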

Source

Dell Promises ExaScale By 2015

June 17, 2013 by  
Filed under Computing


Dell has claimed it will make exascale computing available by 2015, as the firm enters the high performance computing (HPC) market.

Speaking at the firm’s Enterprise Forum in San Jose, Sam Greenblatt, chief architect of Dell’s Enterprise Solutions Group, said the firm will have exascale systems by 2015, ahead of rival vendors. However, he added that development will not be boosted by a doubling in processor performance, saying Moore’s Law is no longer valid and is actually presenting a barrier for vendors.

“It’s not doubling every two years any more, it has flattened out significantly,” he said. According to Greenblatt, the only way firms can achieve exascale computing is through clustering. “We have to design servers that can actually get us to exascale. The only way you can do it is to use a form of clustering, which is getting multiple parallel processes going,” he said.
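The clustering arithmetic behind Greenblatt's argument is straightforward: if per-node performance has flattened, the only remaining lever is node count. The per-node figure below is an illustrative assumption, not a Dell specification.

```python
# If single-node performance has flattened, exascale has to come from
# clustering: total FLOPS = nodes * FLOPS per node. The 10 teraFLOPS
# per-node figure is an illustrative assumption, not a Dell spec.
EXAFLOP = 1e18            # one exaFLOPS, the target
flops_per_node = 10e12    # assumed 10 TFLOPS per node

nodes_needed = EXAFLOP / flops_per_node
print(f"{nodes_needed:,.0f} nodes")  # 100,000 nodes
```

At that assumed per-node performance, an exascale system needs on the order of a hundred thousand nodes running in parallel, which is why packaging, programming models and data movement, not raw processor speed, dominate the discussion.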

Not only did Greenblatt warn that hardware will have to be packaged differently to reach exascale performance, he said that programmers will also need to change. “This is going to be an area that’s really great, but the problem is you never programmed for this area, you programmed to that old Von Neumann machine.”

According to Greenblatt, shifting of data will also be cut down, a move that he said will lead to network latency being less of a performance issue. “Things are going to change very dramatically: your data is going to get bigger, processing power is going to get bigger and network latency is going to start to diminish, because we can’t move all this [data] through the pipe,” he said.

Greenblatt’s reference to data being closer to the processor is a nod to the increasing volume of data that is being handled. While HPC networking firms such as Mellanox and Emulex are increasing bandwidths on their respective switch gear, bandwidth increases are being outpaced by the growth in the size of datasets used by firms deploying analytics workloads or academic research.

That Dell is projecting 2015 for the arrival of exascale clusters is at least a few years sooner than firms such as Intel, Cray and HP, all of which have put a “by 2020” timeframe on the challenge. However, what Greenblatt did not mention is the projected power efficiency of Dell’s 2015 exascale cluster, something that will be critical to its usability.

Source
