
Graphene May Give Processors A Boost

June 28, 2016
Filed under Computing


Researchers at MIT have figured out that graphene, sheets of atom-thick carbon, could be used to make chips a million times faster.

The researchers have worked out that slowing the speed of light to the extent that it moves slower than flowing electrons can create an “optical boom”, the optical equivalent of a sonic boom.

Slowing the speed of light is no mean feat, but the clever folks at MIT managed it by using the honeycomb structure of carbon to slow photons to several hundredths of their normal speed in free space, explained researcher Ido Kaminer.

Meanwhile, the characteristics of graphene speed up electrons to a million metres a second, or around 1/300 of the speed of light in a vacuum.
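
That fraction is easy to check with a couple of lines of Python (a back-of-the-envelope calculation of ours, not part of the MIT work):

    # Rough check of the "1/300 of the speed of light" figure quoted above.
    c = 299_792_458          # speed of light in a vacuum, m/s
    v_electron = 1_000_000   # electron speed in graphene quoted in the article, m/s
    print(c / v_electron)    # roughly 300, i.e. the electrons move at about 1/300 of c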

The optical boom is caused when electrons passing through the graphene outrun the slowed light, effectively breaking the light barrier inside the carbon honeycomb and causing a shockwave of light.

As electrons move faster than the trapped light, they bleed plasmons, a form of virtual particle that represents the oscillation of electrons on the graphene’s surface.

Effectively, it is the equivalent of turning electricity into light. This is nothing new – Thomas Edison did it more than a century ago with the incandescent bulb – but the new technique can efficiently and controllably generate plasmons at a scale that works with microchip technology.

The discovery could allow chip components to be made from graphene to enable the creation of light-based circuits. These circuits could be the next step in the evolution of chip and computing technology, as the transfer of data through light is far faster than using electrons in today’s chips, even the fast pixel-pushing ones.

So much faster that it’s “six orders of magnitude higher than what is used in electronics”, according to Kaminer. That’s up to a million times faster in plain English.

“There’s a lot of excitement about graphene because it could be easily integrated with other electronics,” said physics professor Marin Soljačić, a researcher on the project, who is confident that MIT can turn this theoretical experiment into a working system. “I have confidence that it should be doable within one to two years.”

This is a pretty big concept and almost sci-fi stuff, but we’re always keen to see smaller and faster chips. It also shows that the future tech envisioned by the world of sci-fi may not be that far away.

Courtesy-TheInq

Google Upgrades Voice Search

October 8, 2015
Filed under Around The Net


Google said it has constructed a better neural network that is making its voice search work faster and better in noisy environments.

“We are happy to announce that our new acoustic models are now used for voice searches and commands in the Google app (on Android and iOS), and for dictation on Android devices,” Google’s Speech Team wrote in a recent blog post. “In addition to requiring much lower computational resources, the new models are more accurate, robust to noise, and faster to respond to voice search queries.”

In 2013, Google brought the same voice recognition tools that had been working in Google Now to Google Search.

Along with being able to find information on the Internet, Google Voice Search also was able to find information for users in their Gmail, Google Calendar and Google+ accounts.

At the 2013 Google I/O developers conference, Amit Singhal, today a senior vice president and Google Fellow, said the future of search is in voice. For Google, he said, future searches will be more like conversations with your computer or device, which will also be able to give you information before you even ask for it.

The company went on to make it clear that it would continue to focus on voice search.

And this week’s announcement backs that up.

Google explained in its blog post that it has updated the neural network it’s using for voice search. A neural network is a computer system based on the way the human brain and nervous system work. It generally uses many processors operating in parallel.

The improved neural network is able to consume the incoming audio in larger chunks than conventional models without performing as many calculations.

“With this, we drastically reduced computations and made the recognizer much faster,” the team wrote. “We also added artificial noise and reverberation to the training data, making the recognizer more robust to ambient noise.”
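
The efficiency argument is easy to picture: feeding the recognizer larger, more widely spaced chunks of audio simply means fewer forward passes per second of speech. The Python sketch below is purely illustrative; the frame sizes and the stand-in “model” are our assumptions, not details Google has published.

    # Illustrative only: larger, more widely spaced chunks mean fewer model evaluations.
    # The frame/hop sizes and the stand-in "model" below are invented for illustration.
    import numpy as np

    def acoustic_model(chunk):
        # Stand-in for one neural-network forward pass over a chunk of audio features.
        return float(chunk.mean())

    audio = np.random.randn(16000)                 # one second of 16kHz audio (synthetic)
    frame = 400                                    # 25ms analysis window
    small_hop, large_hop = 160, 480                # 10ms hop vs 30ms hop between chunks

    dense  = [acoustic_model(audio[i:i + frame]) for i in range(0, len(audio) - frame, small_hop)]
    sparse = [acoustic_model(audio[i:i + frame]) for i in range(0, len(audio) - frame, large_hop)]

    print(len(dense), len(sparse))                 # about 98 vs 33 forward passes for the same audio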

Source-http://www.thegurureview.net/aroundnet-category/google-upgrades-voice-search.html

Intel Shows New IoT Platform

December 23, 2014
Filed under Computing


Intel showed off a new platform which it claims makes it easier for companies to create Internet-connected smart products using its chips, security and software.

Intel’s platform is like Lego: a kit of the chipmaker’s components and software that companies can use to create smart, connected devices. The only difference is that you can’t enact your own Doctor Who scene with it.

Doug Davis, head of Intel’s Internet of Things business, said at a launch event in San Francisco that it will make it a doddle to connect to data centres in order to analyse data collected from devices’ sensors.

Intel’s chips should provide compute capability in end-point devices that scales from its highest-performance Xeon processors down to the Quark family of products.

Intel’s Internet of Things Group had $530 million in revenue in the September quarter. That accounted for just 4 percent of Intel’s total revenue in the quarter, but it grew 14 percent over the previous year, which was faster than the company’s PC business.

Dell, SAP, Tata Consultancy, Accenture and other companies are working with the new reference model, Davis said.

Source

IBM Breaks Big Data Record

February 28, 2014
Filed under Computing


IBM Labs claims to have broken a speed record for Big Data, which the company says could help boost internet speeds to between 200Gbps and 400Gbps using “extremely low power”.

The scientists achieved the speed record using a prototype device presented at the International Solid-State Circuits Conference (ISSCC) this week in San Francisco.

Apparently the device, which employs analogue-to-digital conversion (ADC) technology, could be used to improve the transfer speed of Big Data between clouds and data centres to four times faster than existing technology.

IBM said its device is fast enough that 160GB – the equivalent of a two-hour 4K ultra-high definition (UHD) movie or 40,000 music tracks – could be downloaded in a few seconds.
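
That claim works out as advertised; here is a one-line check in Python (our arithmetic, not IBM’s):

    # Rough check of the download-time claim above (our arithmetic, not IBM's).
    movie_gigabits = 160 * 8                            # a 160GB movie expressed in gigabits
    print(movie_gigabits / 400, movie_gigabits / 200)   # ~3.2s at 400Gbps, ~6.4s at 200Gbps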

The IBM researchers have been developing the technology in collaboration with Swiss research institution Ecole Polytechnique Fédérale de Lausanne (EPFL) to tackle the growing demands of global data traffic.

“As Big Data and internet traffic continues to grow exponentially, future networking standards have to support higher data rates,” the IBM researchers explained, comparing the 100GB of data transferred per day in 1992 with today’s two exabytes per day, a 20 million-fold increase.

“To support the increase in traffic, ultra-fast and energy efficient analogue-to-digital converter (ADC) technology [will] enable complex digital equalisation across long-distance fibre channels.”

An ADC device converts analogue signals to digital, estimating the right combination of zeros and ones to digitally represent the data so it can be stored on computers and analysed for patterns and predictive outcomes.
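
As a toy illustration of that conversion step, the following Python quantises a synthetic analogue waveform into 8-bit integer codes. It is a generic textbook ADC, not a model of IBM’s design:

    # Toy 8-bit ADC: sample an analogue waveform and map each sample to an integer code.
    # Generic illustration only; IBM's converter is vastly faster and more sophisticated.
    import numpy as np

    bits = 8
    levels = 2 ** bits                              # 256 quantisation levels
    t = np.linspace(0.0, 1.0, 1000)                 # 1000 sample instants over one second
    analogue = np.sin(2 * np.pi * 5 * t)            # a 5Hz test signal in the range [-1, 1]

    codes = np.round((analogue + 1) / 2 * (levels - 1)).astype(int)   # integer codes 0..255
    print(codes[:10])                               # the digital representation of the first samples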

“For example, scientists will use hundreds of thousands of ADCs to convert the analogue radio signals that originate from the Big Bang 13 billion years ago to digital,” IBM said.

The ADC technology has been developed as part of an international project called Dome, a collaboration between the Netherlands Institute for Radio Astronomy (ASTRON), DOME-South Africa and IBM to build the Square Kilometre Array (SKA), which will be the world’s largest and most sensitive radio telescope when it’s completed.

“The radio data that the SKA collects from deep space is expected to produce 10 times the global internet traffic and the prototype ADC would be an ideal candidate to transport the signals fast and at very low power – a critical requirement considering the thousands of antennas which will be spread over 1,900 miles,” IBM explained.

IBM Research Systems department manager Dr Martin Schmatz said, “Our ADC supports Institute of Electrical and Electronics Engineers (IEEE) standards for data communication and brings together speed and energy efficiency at 32 nanometers, enabling us to start tackling the largest Big Data applications.”

He said that IBM is developing the technology for its own family of products, ranging from optical and wireline communications to advanced radar systems.

“We are bringing our previous generation of the ADC to market less than 12 months since it was first developed and tested,” Schmatz added, noting that the firm will develop the technology in communications systems such as 400Gbps opticals and advanced radars.

Source

Does The Cloud Need To Standardize?

September 20, 2013
Filed under Computing


Frank Baitman, the CIO of the U.S. Department of Health and Human Services (HHS), was at the Amazon Web Services conference praising the company’s services. Baitman’s lecture was on the verge of becoming a long infomercial when he stepped back and changed direction.

Baitman has reason to speak well of Amazon. As the big government system integrators slept, Amazon rushed in with its cloud model and began selling its services to federal agencies. HHS and Amazon worked together in a real sense.

The agency helped Amazon get an all-important security certification best known by its acronym, FedRAMP, while moving its health data to Amazon’s cloud. Amazon was the first large cloud vendor to get this security certification.

“[Amazon] gives us the scalability that we need for health data,” said Baitman.

But then he said that while it would “make things simpler and nicer” to work with Amazon, since they did the groundwork to get Amazon federal authorizations, “we also believe that there are different reasons to go with different vendors.”

Baitman said that HHS will be working with other vendors as it has with Amazon.

“We recognize different solutions are needed for different problems,” said Baitman. “Ultimately we would love to have a competitive environment that brings best value to the taxpayer and keeps vendors innovating.”

To accomplish this, HHS plans to implement a cloud broker model, an intermediary process that can help government entities identify the best cloud approach for a particular workload. That means being able to compare different price points, terms of service and service-level agreements.

To make comparisons possible, Baitman said the vendors will have to “standardize in those areas that we evaluate cloud on.”
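
The broker idea is easy to picture as a comparison over a few standardised fields per vendor. The Python sketch below is hypothetical; the vendor names, prices and SLA figures are invented for illustration and are not HHS’s actual evaluation criteria:

    # Hypothetical cloud-broker comparison; every name and number here is invented.
    offers = [
        {"vendor": "VendorA", "price_per_hour": 0.12, "sla_uptime": 99.95, "terms": "monthly"},
        {"vendor": "VendorB", "price_per_hour": 0.10, "sla_uptime": 99.90, "terms": "annual"},
        {"vendor": "VendorC", "price_per_hour": 0.15, "sla_uptime": 99.99, "terms": "monthly"},
    ]

    # Pick the cheapest offer that still meets a minimum uptime requirement.
    eligible = [o for o in offers if o["sla_uptime"] >= 99.95]
    best = min(eligible, key=lambda o: o["price_per_hour"])
    print(best["vendor"])                           # "VendorA" in this made-up example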

The Amazon conference had about 2,500 registered to attend, and judging from the size of the crowd it certainly appeared to have that many at the Washington Convention Center. It was a leap in attendance. In 2012, attendance at Amazon’s government conference was about 900; in 2011, 300 attended; and in 2010, just 50, Teresa Carlson, vice president of worldwide public sector at Amazon, said in an interview.

Source