Are Quantum Computers On The Horizon?

March 18, 2016
Filed under Computing

Researchers at the Massachusetts Institute of Technology (MIT) and Austria’s University of Innsbruck claim to have put together a working quantum computer capable of solving a simple mathematical problem.

The architecture they have devised ought to be relatively easy to scale, and could therefore form the basis of workable quantum computers in the future – with a bit of “engineering effort” and “an enormous amount of money”, according to Isaac Chuang, professor of physics, electrical engineering and computer science at MIT.

Chuang’s team has put together a prototype comprising the first five quantum bits (or qubits) of a quantum computer. This is being tested on mathematical factoring problems, which could have implications for applications that use factoring as the basis for encryption to keep information, including credit card details, secure.

The proof-of-concept has been applied only to the number 15, but the researchers claim that this is the “first scalable implementation” of quantum computing to run Shor’s algorithm, a quantum algorithm that can quickly calculate the prime factors of large numbers.
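
To put Shor’s algorithm in more concrete terms, here is a minimal classical sketch of the recipe the quantum hardware speeds up. This is not the researchers’ ion-trap implementation; on a real quantum computer the period-finding step below is the part carried out with qubits, while the rest is ordinary arithmetic.

```python
# Illustrative sketch only: classical demonstration of Shor's recipe for N = 15.
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1. This is the step a quantum
    computer accelerates; here it is found by brute force."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n, a=7):
    """Return two non-trivial factors of n, or None if this choice of a fails."""
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)      # a already shares a factor with n
    r = find_period(a, n)
    if r % 2 != 0 or pow(a, r // 2, n) == n - 1:
        return None                            # retry with a different a
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_factor(15))   # -> (3, 5)
```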

“The team was able to keep the quantum system stable by holding the atoms in an ion trap, where they removed an electron from each atom, thereby charging it. They then held each atom in place with an electric field,” explained MIT.

Chuang added: “That way, we know exactly where that atom is in space. Then we do that with another atom, a few microns away – [a distance] about 100th the width of a human hair.

“By having a number of these atoms together, they can still interact with each other because they’re charged. That interaction lets us perform logic gates, which allow us to realise the primitives of the Shor factoring algorithm. The gates we perform can work on any of these kinds of atoms, no matter how large we make the system.”

Chuang is a pioneer in the field of quantum computing. He designed a quantum computer in 2001 based on one molecule that could be held in ‘superposition’ and manipulated with nuclear magnetic resonance to factor the number 15.

The results represented the first experimental realisation of Shor’s algorithm, but the system wasn’t scalable because it became more difficult to control as more atoms were added.

However, the architecture that Chuang and his team have put together is, he believes, highly scalable, and will enable the team to build quantum computing devices capable of factoring much larger numbers.

“It might still cost an enormous amount of money to build, [and] you won’t be building a quantum computer and putting it on your desktop anytime soon, but now it’s much more an engineering effort and not a basic physics question,” said Chuang.

In other quantum computing news this week, the UK government has promised £200m to support engineering and physical sciences PhD students and fuel UK research into quantum technologies, although most of the cash will be spent on Doctoral Training Partnerships rather than trying to build workable quantum computing prototypes.

Courtesy-TheInq

ARM Goes 4K With Mali

February 5, 2016
Filed under Computing

ARM has announced a new mobile graphics chip, the Mali-DP650, which it said was designed to handle 4K content on a device’s screen and on an external display.

Although the new Mali GPU can push enough pixels for the local display, ARM is more likely interested in using the technology for streaming.

Many smartphones can record 4K video, which means they could become a home for high-resolution content that can be streamed to a large, high-resolution screen.

It looks like the Mali-DP650 can juggle the device’s native resolution, the external display’s resolution and variable refresh rates. At least, that is what ARM says it can do.

The GPU is naturally able to handle different resolutions, but it is optimized for “2.5K” displays, meaning WQXGA (2560×1600) on tablets and WQHD (2560×1440) on smartphones, as well as Full HD (1920×1080) for slightly lower-end devices.

Mark Dickinson, general manager of ARM’s media processing group, said: “The Mali-DP650 display processor will enable mobile screens with multiple composition layers, for graphics and video, at Full HD (1920×1080 pixels) resolutions and beyond while maintaining excellent picture quality and extending battery life.”

“Smartphones and tablets are increasingly becoming content passports, allowing people to securely download content once and carry it to view on whichever screen is most suitable. The ability to stream the best quality content from a mobile device to any screen is an important capability ARM Mali display technology delivers.”

ARM did not say when the Mali-DP650 will be in the shops or which chips will be the first to incorporate its split-display mode feature.

Courtesy-Fud

Seagate Goes 8TB For Surveillance

November 13, 2015
Filed under Computing

Seagate has become the first hard drive company to create an 8TB unit aimed specifically at the surveillance market, targeting system integrators, end users and system installers.

The Seagate Surveillance HDD, as those wags in marketing have named it, is the highest capacity of any specialist drive for security camera set-ups, and Seagate cites its main selling points as maximizing uptime while removing the need for excess support.

“Seagate has worked closely with the top surveillance manufacturers to evolve the features of our Surveillance HDD products and deliver a customized solution that has precisely matched market needs in this evolving space for the last 10 years,” said Matt Rutledge, Seagate’s senior vice president for client storage.

“With HD recordings now standard for surveillance applications, Seagate’s Surveillance HDD product line has been designed to support these extreme workloads with ease and is capable of a 180TB/year workload, three times that of a standard desktop drive.

“It also includes surveillance-optimized firmware to support up to 64 cameras and is the only product in the industry that can support surveillance solutions, from single-bay DVRs to large multi-bay NVR systems.”

The 3.5in drive is designed to run 24/7 and is able to capture 800 hours of high-definition video from up to 64 cameras simultaneously, making it ideal for shopping centers, urban areas, industrial complexes and anywhere else you need to feel simultaneously safe and violated. Its capacity will allow 6PB in a 42U rack.
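
As a rough back-of-the-envelope check (our arithmetic, not Seagate’s, and it ignores enclosure overhead), the 6PB figure implies roughly 750 of the 8TB drives packed into the rack:

```python
# Sanity check of the 6PB-per-42U-rack claim, assuming the rack holds
# nothing but these 8TB drives.
drive_capacity_tb = 8
rack_capacity_pb = 6
drives_needed = rack_capacity_pb * 1000 / drive_capacity_tb
print(drives_needed)   # -> 750.0 drives spread across the rack's enclosures
```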

Included in the deal is the Seagate Rescue Service, capable of restoring lost data in two weeks if circumstances permit, and sold with end users in mind for whom an IT support infrastructure is either non-existent or off-site. The service has a 90 percent success rate and is available as part of the drive cost for the first three years.

Seagate demonstrated the drive today at the China Public Security Expo. Where better than the home of civil liberty infringement to show off the new drive?

Earlier this year, Seagate announced a new co-venture with SSD manufacturer Micron, which will come as a huge relief after the recent merger announcement between WD and SanDisk.

Courtesy-http://www.thegurureview.net/computing-category/seagate-goes-8tb-for-surveillance.html

ARM’s Mali GPU Going To Wearables

November 2, 2015
Filed under Computing

ARM has announced the Mali-470 GPU targeted at Internet of Things (IoT) and wearable devices.

The new Mali-470 GPU has half the power consumption and two times the energy efficiency of the Mali-400, and is designed for next-generation wearables and IoT devices such as industrial control panels and healthcare monitors that rely on low-cost and low-power chips.

The Mali-470 supports OpenGL ES 2.0, used by Android and Android Wear, hinting that the GPU could also find its way into low-cost smartphones. If not, ARM promises that the chip will bring smartphone-quality visuals to wearable and IoT devices, supporting screen resolutions of up to 640×640 on single-core devices, and higher resolutions for multi-core configurations.

ARM envisions the new GPU paired with its efficient Cortex-A7 or A53 CPU designs for a low-power SoC.

“ARM scrutinises every milliwatt across the entire SoC to enable OEMs to optimize energy efficiency and open up new opportunities,” said Mark Dickinson, vice president and general manager of ARM’s multimedia processing group.

“Tuning efficiency is particularly relevant for devices requiring sophisticated graphics on a low power budget such as wearables, entry-level smartphones and IoT devices. The Mali-470 has been designed to meet this demand by enabling a highly capable user interface while being extremely energy efficient.”

ARM expects the first SoCs using the GPU to be ready by the end of 2016, meaning that the chip will start showing up in devices the following year.

The launch of the Mali-470 GPU comes just hours after ARM announced plans to pick up the product portfolio and other business assets of Carbon Design Systems, a supplier of cycle-accurate virtual prototyping solutions.

The deal will see Carbon’s staff transfer to ARM, where the chip firm will make use of the Massachusetts-based outfit’s expertise in virtual prototypes. This will enable ARM to iron out any bugs and make improvements to chips before they move to foundries for production.

ARM also said that Carbon will help the firm enhance its capability in SoC architectural exploration, system analysis and software bring-up.

Courtesy-TheInq

Will A.I. Create The Next Industrial Revolution?

June 2, 2015
Filed under Computing

Artificial Intelligence will be responsible for the next industrial revolution, experts in the field have claimed, as intelligent computer systems replace certain human-operated jobs.

Four computer science experts talked about how advances in AI could lead to a “hollowing out” of middle-income jobs during a panel debate hosted by ClickSoftware about the future of technology.

“It’s really important that we take AI seriously. It will lead to the fourth industrial revolution and will change the world in ways we cannot predict now,” said AI architect and author George Zarkadakis.

His mention of the “fourth industrial revolution” refers to the computerization of the manufacturing industry.

If the first industrial revolution was the mechanisation of production using water and steam power, followed by the second which introduced mass production with the help of electric power, then the third is what we are currently experiencing: the digital revolution and the use of electronics and IT to further automate production.

The fourth industrial revolution, which is sometimes referred to as Industry 4.0, is the vision of the ‘smart factory’, where cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralized decisions.

These cyber-physical systems communicate and cooperate with each other and humans in real time over the Internet of Things.

Dan O’Hara, professor of cognitive computing at Goldsmiths, University of London, explained that this fourth industrial revolution will not be the same kind of “hollowing out” of jobs that we saw during the last one.

“It [won’t be] manual labour replaced by automation, but it’ll be the hollowing out of middle-income jobs, medium-skilled jobs,” he said.

“The industries that will be affected the most from a replacement with automation are construction, accounts and transport. But the biggest [industry] of all, remembering this is respective to the US, is retail and sales.”

O’Hara added that many large organisations’ biggest expense is people, who already work alongside intelligent computer systems, and this area is most likely to be affected as companies look to reduce costs.

“Anything that’s working on an AI-based system is bound to be very vulnerable to the replacement by AI as it’s easily automated already,” he said.

However, while AI developments in the retail space could lead to the replacement of jobs, the technology also holds promise for the companies deploying it.

Mark Bishop, professor of cognitive computing at Goldsmiths, highlighted that AI could save businesses money if it becomes smart enough to spot price variants in company spending, for example by scanning through years of an organisation’s invoice database to detect the cheapest prices paid and so cut outgoings.

While some worry that AI will take over jobs, others have said that it will replace humans altogether.

John Lewis IT chief Paul Coby said earlier this year that the blending of AI and the IoT in the future could signal the end of civilisation as we know it.

Coby explained that the possibilities are already with us in terms of AI and that we ought to think about how “playing with the demons” could be detrimental to our future.

Apple co-founder Steve Wozniak added to previous comments from Stephen Hawking and Elon Musk with claims that “computers are going to take over from humans”.

Woz made his feelings on AI known during an interview with the Australian Financial Review, and agreed with Hawking and Musk that its potential to surpass humans is worrying.

“Computers are going to take over from humans, no question. Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people,” he said.

Source

ARM Sets New mBed Standard

May 29, 2015
Filed under Computing

ARM has brought in a new assurance standard to work with embedded devices.

The ARM mbed Enabled program aims to increase the deployment rate of Internet of Things (IoT) products and supporting technologies by giving partners the ability to label them as interoperable mbed-based devices.

ARM said that the accreditation program will cover solutions entering a broad range of developer markets, from silicon and modules to OEM products and innovative cloud services. Accreditation will be free of charge.

Zach Shelby, ARM’s vice president of IoT business marketing, said that ARM mbed Enabled accreditation will assure the diverse IoT ecosystem that it is using technologies backed by an expert community of innovators.

“This will also instill confidence in end markets where interoperability, trust and security standardisation is required to unlock commercial potential.”

Since the ARM mbed IoT Device Platform was announced in October 2014, the mbed Partner ecosystem has continued to grow from the initial 24 launch partners. Today, 8 new partners are being announced including Advantech, Athos, Captiva, Espotel, Maxim Integrated, MegaChips, SmeshLink, and Tieto.

Source

Can MediaTek Bring The Cortex-A72 To Market?

March 31, 2015
Filed under Computing

MediaTek became the first chipmaker to publicly demo a SoC based on ARM’s latest Cortex-A72 CPU core, but the company’s upcoming chip still relies on the old 28nm manufacturing process.

We had a chance to see the upcoming MT8173 in action at the Mobile World Congress a couple of weeks ago.

The next step is to bring the new Cortex-A72 core to a new node and into mobiles. This is what MediaTek is planning to do by the end of the year.

Cortex-A72 smartphone parts coming in Q4

It should be noted that MediaTek’s 8000-series parts are designed for tablets, and the MT8173 is no exception. However, the new core will make its way to smartphone SoCs later this year, as part of the MT679x series.

According to Digitimes Research, MediaTek’s upcoming MT679x chips will utilize a combination of Cortex-A53 and Cortex-A57 cores. It is unclear whether MediaTek will use the planar 20nm node or 16nm FinFET for the new part.

By the looks of it, this chip will replace the 32-bit MT6595, which is MediaTek’s most successful high-performance part yet, with a few relatively big design wins, including Alcatel, Meizu, Lenovo and Zopo. The new chip will also supplement, and possibly replace, the recently introduced MT6795, a 64-bit Cortex-A53/Cortex-A72 part used in the HTC Desire 826.

More questions than answers

Digitimes also claims the MT679x Cortex-A72 parts may be the first MediaTek products to benefit from AMD technology, but details are scarce. We can’t say whether or not the part will use AMD GPU technology, or some HSA voodoo magic. Earlier this month we learned that MediaTek is working with AMD and the latest report appears to confirm our scoop.

The other big question is the node. The chip should launch toward the end of the year, so we probably won’t see any devices prior to Q1 2016. While 28nm is still alive and kicking, by 2016 it will be off the table, at least in this market segment. Previous MediaTek roadmap leaks suggested that the company would transition to 20nm on select parts by the end of the year.

However, we are not entirely sure 20nm will cut it for high-end parts in 2016. Huawei has already moved to 16nm with its latest Kirin 930 SoC, Samsung stunned the world with the 14nm Exynos 7420, and Qualcomm’s upcoming Snapdragon 820 will be a FinFET part as well.

It is obvious that most, if not all, high-end SoCs next year will skip TSMC’s and Samsung’s 20nm nodes. With that in mind, it would be logical to expect MediaTek to use a FinFET node as well. On the other hand, depending on the cost, 20nm could still make sense for MediaTek – provided it ends up significantly cheaper than FinFET. While a 20nm chip wouldn’t deliver the same level of power efficiency and performance, with the right price it could find its way to more affordable mid-range devices, or flagships designed by smaller, value-oriented brands (especially those focusing on Chinese and Indian markets).

Source

MediaTek Shows Off New IoT Platform

February 16, 2015
Filed under Computing

MediaTek has announced a new development platform as part of its MediaTek Labs initiative.

The LinkIt Connect MT7681 platform is based on the MT7681 SoC, designed for simple and affordable WiFi-enabled Internet of Things (IoT) devices. The company also released a software development kit (SDK) and hardware development kit (HDK) for the new platform.

The HDK includes the LinkIt Connect 7681 development board, which features the MT7681 chipset, micro-USB port and pins for various I/O interfaces. The chipset can be used in WiFi station or access point modes.

In station mode, the chip connects to a wireless access point and communicates with web services or cloud servers, which means it could be used to control smart thermostats. However, in access point mode, the chipset can communicate with devices directly, for example to control smart plugs or light bulbs using a smartphone.
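
As a rough conceptual sketch of the difference between the two modes (this is plain Python rather than the MT7681’s C-based SDK, and the endpoint URL and port are made up for illustration), station mode pushes data out to a cloud service, while access point mode has the device serve commands directly:

```python
# Conceptual illustration only; not MediaTek SDK code.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

def station_mode_report(reading_c):
    """Station mode: the device joins an existing Wi-Fi network and reports
    a sensor reading to a (hypothetical) cloud endpoint."""
    payload = json.dumps({"temperature_c": reading_c}).encode()
    req = request.Request("http://cloud.example.com/report", data=payload,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

class PlugHandler(BaseHTTPRequestHandler):
    """Access point mode: the device runs its own tiny server, and a phone
    connected directly to the device's Wi-Fi sends it on/off commands."""
    def do_POST(self):
        command = self.rfile.read(int(self.headers["Content-Length"])).decode()
        print("switching smart plug:", command)
        self.send_response(200)
        self.end_headers()

def serve_access_point_mode(port=8080):
    HTTPServer(("0.0.0.0", port), PlugHandler).serve_forever()
```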

“The world is rapidly moving towards connecting every imaginable device in the home, yet developers often have to spend too much effort on making their products Wi-Fi enabled,” said Marc Naddell, VP, MediaTek Labs. “The MediaTek LinkIt Connect 7681 platform simplifies and accelerates this process so that developers can focus on making innovative home IoT products, whatever their skill level.”

MediaTek Labs was launched in September 2014 and its goal is to promote a range of innovative MediaTek platforms aimed at frugal devices such as wearables, IoT and home automation hardware.

Source

MediaTek Cuts Xiaomi

December 12, 2014
Filed under Computing

The dark satanic rumour mill manufactured a hell on earth rumour that MediaTek has stopped supplying chips to Xiaomi.

MediaTek is apparently cross that Xiaomi has been investing in SoC supplier Leadcore Technology. Xiaomi has reportedly reached a deal with Leadcore allowing the phone maker to get access to the chip designer’s technology patents. DigiTimes, however, suggests that MediaTek has been trying to expand its presence in the mid-range and high-end market segments, but finds Xiaomi’s pricing strategy is disrupting its plans.

MediaTek’s MT6589T, a quad-core 1.5GHz chip, was originally designed to target mid-range and high-end mobile devices. The solution was introduced in Xiaomi’s Redmi smartphone in August 2013. However, prices for the Redmi series have been cut to as low as $114.

Xiaomi was ranked the third-largest smartphone vendor worldwide in the third quarter of 2014. Its shipments for the quarter registered a 211.3 per cent on-year jump, boosting its market share to 5.3 per cent from 2.1 per cent during the same period of 2013.

Source

Will The Chip Industry Take A Fall?

October 24, 2014
Filed under Computing

Microchip Technology has managed to scare Wall Street by warning of an industry downturn. This follows rumours that a number of US semiconductor makers with global operations are seeing reduced demand for chips in regions ranging from Asia to Europe.

Microchip Chief Executive Steve Sanghi warned that the correction will spread more broadly across the industry in the near future. Microchip expects to report sales of $546.2 million for its fiscal second quarter ending in September. The company had earlier forecast revenue in a range of $560 million to $575.9 million. Semiconductor companies’ shares are volatile at the best of times and news like this is the sort of thing that investors do not want to hear.

Trading in Intel, which is due to report third-quarter results tomorrow, was 2.6 times the usual volume. Micron, which makes dynamic random access memory, or DRAM, was the third-most traded name in the options market. All this seems to suggest that the market is a bit spooked, and much will depend on what Chipzilla tells the world tomorrow as to whether it goes into a nosedive.

Source
