Syber Group
Toll Free : 855-568-TSTG(8784)

Are Some IoT Gadgets Pointless?

November 30, 2015
Filed under Around The Net

The man who first coined the term “Internet of Things” (IoT) has hit out at the bastardisation of the concept, calling on UK developers to lead the charge on making it a reality.

In an address on day two of Microsoft’s Future Decoded event in London, Kevin Ashton showed examples of supposed IoT devices such as the wine bottle that tells you if you’re drunk and the toothbrush that tells you if you’ve brushed your teeth.

Describing Kickstarter as “where bad ideas go to get funded”, he talked about the true nature of IoT and its roots in machine-to-machine communication that’s neither accessed nor processed by humans.

“This information isn’t going on a spreadsheet or a pivot table,” he explained. “It’s a sensor on a device in the world sending data to another device which makes a decision which feeds out into the world.”

In short: “We don’t collect data. Machines collect data from sensors and we turn the world into data.”

The perfect example of this is the mobile phone. “We call a phone a phone for legacy reasons,” he said. “A phone is just an app on your device. You probably use Candy Crush or Angry Birds more than you use it for actual calls. What a smartphone actually is, is a wireless sensor platform.”

He said that historically the UK has been at the forefront of internet developments, so it’s only right that the country takes a leading role in the evolution of the IoT.

Citing self-driving cars as a good example of the IoT at work, he predicted that by 2030 such vehicles will be the norm, and that the question should not be “Are self-driving cars safe?” but “Are human-driven cars safe?”, pointing out that 3,000 people are killed on the roads every day by human-driven cars, and so far at least, there have been no serious accidents involving autonomous vehicles.

Courtesy-http://www.thegurureview.net/computing-category/are-some-iot-gadgets-pointless.html

Will The IoT Market Value Reach 330 Billion By 2025?

November 25, 2015
Filed under Around The Net

Beancounters working for analysts Navigant Research have added up some numbers and divided by their shoe size and decided that global revenues from residential IoT devices are expected to total more than $330 billion by 2025.

These are devices such as smart thermostats that let users control household temperatures remotely, or LED lights that can be switched on and off from a smartphone. Essentially, it is the IoT concept applied to the residential setting.

According to Navigant Research, global revenue from shipments of residential IoT devices is expected to total more than US$330 billion from 2015-2025. That is a lot of talking fridges and internet-connected underwear.

Neil Strother, principal research analyst with Navigant Research said that the IoT is like putting together a jigsaw puzzle without any edge pieces, with the number of pieces growing exponentially into the billions.

“Communicating devices in the IoT traverse a wide range of industries and sectors; virtually all areas of life can expect to see some form of this connected world.”

Despite the many drivers for the residential IoT market, there are at present multiple protocols and standards that are creating an interoperability barrier, he said.
Wi-Fi, ZigBee, Bluetooth, and others are all vying for market viability, which is creating confusion for consumers and stalling overall adoption, he said.

Courtesy-Fud

Seagate Goes 8TB For Surveillance

November 13, 2015
Filed under Computing

Seagate has become the first hard drive company to create an 8TB unit aimed specifically at the surveillance market, targeting system integrators, end users and system installers.

The Seagate Surveillance HDD, as those wags in marketing have named it, is the highest capacity of any specialist drive for security camera set-ups, and Seagate cites its main selling points as maximizing uptime while removing the need for excess support.

“Seagate has worked closely with the top surveillance manufacturers to evolve the features of our Surveillance HDD products and deliver a customized solution that has precisely matched market needs in this evolving space for the last 10 years,” said Matt Rutledge, Seagate’s senior vice president for client storage.

“With HD recordings now standard for surveillance applications, Seagate’s Surveillance HDD product line has been designed to support these extreme workloads with ease and is capable of a 180TB/year workload, three times that of a standard desktop drive.

“It also includes surveillance-optimized firmware to support up to 64 cameras and is the only product in the industry that can support surveillance solutions, from single-bay DVRs to large multi-bay NVR systems.”

The 3.5in drive is designed to run 24/7 and is able to capture 800 hours of high-definition video from up to 64 cameras simultaneously, making it ideal for shopping centers, urban areas, industrial complexes and anywhere else you need to feel simultaneously safe and violated. Its capacity will allow 6PB in a 42U rack.
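The rack-density figure is easy to sanity-check with some back-of-envelope arithmetic (the drive and rack capacities come from the article; the per-U density is derived):

```python
# Sanity check of the claim that 8TB drives yield 6PB in a 42U rack.
drive_tb = 8                          # Seagate Surveillance HDD capacity
rack_pb = 6                           # claimed rack capacity
drives = rack_pb * 1000 // drive_tb   # using 1 PB = 1000 TB
print(drives)                         # 750 drives per rack
print(round(drives / 42, 1))          # ~17.9 drives per rack unit
```

So the 6PB figure implies very dense multi-bay enclosures, consistent with the large multi-bay NVR systems the drive is aimed at.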

Included in the deal is the Seagate Rescue Service, capable of restoring lost data in two weeks if circumstances permit, and sold with end users in mind for whom an IT support infrastructure is either non-existent or off-site. The service has a 90 percent success rate and is available as part of the drive cost for the first three years.

Seagate demonstrated the drive today at the China Public Security Expo. Where better than the home of civil liberty infringement to show off the new drive?

Earlier this year, Seagate announced a new co-venture with SSD manufacturer Micron, which will come as a huge relief after the recent merger announcement between WD and SanDisk.

Courtesy-http://www.thegurureview.net/computing-category/seagate-goes-8tb-for-surveillance.html

Oracle’s M7 Processor Has Security On Silicon

November 10, 2015
Filed under Computing

Oracle started shipping systems based on its latest Sparc M7 processor, which the firm said will go a long way to solving the world’s online security problems by building protection into the silicon.

The Sparc M7 chip was originally unveiled at last year’s Openworld show in San Francisco, and was touted at the time as a Heartbleed-prevention tool.

A year on, and Oracle announced the Oracle SuperCluster M7, along with Sparc T7 and M7 servers, at the show. The servers are all based on the 32-core, 256-thread M7 microprocessor, which offers Security in Silicon for better intrusion protection and encryption, and SQL in Silicon for improved database efficiency.

Along with built-in security, the SuperCluster M7 packs compute, networking and storage hardware with virtualisation, operating system and management software into one giant cloud infrastructure box.

Oracle CTO Larry Ellison was on hand at Openworld on Tuesday to explain why the notion of building security into the silicon is so important.

“We are not winning a lot of these cyber battles. We haven’t lost the war but we’re losing a lot of the battles. We have to rethink how we deliver technology especially as we deliver vast amounts of data to the cloud,” he told delegates.

Ellison said that Oracle’s approach to this cyber war is to take security as low down in the stack as possible.

“Database security is better than application security. You should always push security as low in the stack as possible. At the bottom of the stack is silicon. If all of your data in the database is encrypted, that’s better than having an application code that encrypts your data. If it’s in the database, every application that uses that database inherits that security,” he explained.

“Silicon security is better than OS security. Then every operating system that runs on that silicon inherits that security. And the last time I checked, even the best hackers have not figured out a way to download changes to your microprocessor. You can’t alter the silicon, that’s really tricky.”

Ellison’s big idea is to take software security features out of operating systems, VMs and even databases in some cases – because software can be changed – and instead push them into the silicon, which can’t be. He is also urging that security be switched on by default, without an option to turn it off again.

“The security features should always be on. We provide encryption in our databases but it can be switched off. That is a bad idea. There should be no way to turn off encryption. The idea of being able to turn on and off security features makes no sense,” he said.

Ellison referred back to a debate that took place at Oracle when it first came up with its backup system – should the firm have only encrypted backups. “We did a customer survey and customers said no, we don’t want to pay the performance penalty in some cases,” he recalled. “In that case customer choice is a bad idea. Maybe someone will forget to turn on encryption when it should have been turned on and you lose 10 million credit cards.”

The Sparc M7 is basically Oracle’s answer to this dire security situation. Ellison said that while the M7 has lots of software features built into the silicon, the most “charismatic” of these is Silicon Secured Memory, which is “deceptively simple” in how it works.

“Every time a computer program asks for memory, say you ask for 8MB of memory, we compute a key and assign this large number to that 8MB of memory,” he explained. “We take those bits and we lock that memory. We also assign that same number to the program. Every time the program accesses memory, we check that number to make sure it’s the memory you allocated earlier. That compare is done by the hardware.”

If a program tries to access memory belonging to another program, the hardware detects a mismatch and raises a signal, flagging up a possible breach or bug.
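What Ellison describes amounts to a tag-and-compare scheme. As a rough illustration only, here is a toy software model of the idea (all names are hypothetical; the real M7 performs the comparison in hardware on every memory access):

```python
import secrets

class TaggedHeap:
    """Toy software model of tag-checked memory allocation."""
    def __init__(self):
        self.regions = {}          # region id -> (tag, backing bytes)
        self.next_id = 0

    def alloc(self, size):
        tag = secrets.randbits(4)  # small version number, like a hardware colour
        rid = self.next_id
        self.next_id += 1
        self.regions[rid] = (tag, bytearray(size))
        return (rid, tag)          # the "pointer" carries its tag with it

    def load(self, ptr, offset):
        rid, tag = ptr
        real_tag, mem = self.regions[rid]
        if tag != real_tag:        # mismatch: flag a possible breach or bug
            raise MemoryError("tag mismatch on access")
        return mem[offset]

heap = TaggedHeap()
p = heap.alloc(8)                  # the program's own allocation
heap.load(p, 0)                    # tags match: access allowed
bad = (p[0], (p[1] + 1) % 16)      # forged pointer with the wrong tag
try:
    heap.load(bad, 0)              # access is foiled
except MemoryError:
    pass
```

The point of doing this in silicon rather than in a library like the sketch above is that the check costs nothing at runtime and cannot be patched out by an attacker.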

“We put always-on memory intrusion detection into the silicon. We’re always looking for Heartbleed and Venom-like violations. You cannot turn it off,” the CTO warned.

“We’ve also speeded up encryption and decompression, which is kind of related to encryption. It runs at memory speed, so there’s zero cost in doing that. We turn it on, you can’t turn it off, it’s on all the time. It’s all built into the M7.”

Ellison claimed that running M7-based systems will stop threats like Heartbleed and Venom in their tracks.

“The way Venom worked, the floppy disc driver concealed this code. It’s the worst kind of situation, you’re writing into memory you’re not supposed to. You’re writing computer instructions into the memory and you’ve just taken over the whole computer,” he explained. “You can steal and change data. M7 – the second we tried to write that code into memory that didn’t belong to that program, where the keys didn’t match, that would have been detected real-time and that access would have been foiled.”

All well and good, except for the fact that nearly no current computer system runs on the M7 processor. Ellison claimed that even if only three or four percent of the servers in the cloud an organisation is using have this feature, they will be protected, as they’ll get the early warning needed to deal with the issue across non-M7 systems.

“You don’t have to replace every microprocessor, you just have to replace a few so you get the information real-time,” he added.

“You’ll see us making more chips based on security, to secure our cloud and to sell to people who want to secure their clouds or who want to have secure computers in their datacentre. Pushing security down into silicon is a very effective way to do that and get ahead of bad guys.”

SuperCluster M7 and Sparc M7 servers are available now. Pricing has not been disclosed but based on normal Oracle hardware costs, expect to dig deep to afford one.

Source-http://www.thegurureview.net/computing-category/oracles-new-m7-processor-has-security-on-silicon.html

Is China The Fastest Growing Market For IoT?

November 5, 2015
Filed under Around The Net

China’s Internet of Things (IoT) services revenues will grow faster than anywhere else in the world, according to beancounters working at ABI Research.

ABI has added up the numbers and divided by its shoe size and multiplied by the age of its youngest child and worked out that China’s IoT market will grow more than five times in the next five years, exceeding $41 billion by 2020.

Dan Shey, VP and IoT practice director at ABI Research, said that the smart meter segment is driving China’s IoT numbers.

“It leads all other segments in both connections and revenues. In fact, by 2020, smart meter connections will exceed the next highest market segment in total connections by nearly 10 to 1.”

Other major segments driving the China IoT market are home security and automation, OEM telematics, video surveillance, home appliances, aftermarket telematics and home monitoring.
Home monitoring is expected to become an important market in China as the country attempts to care for its aging population; its citizens aged 55 and older will number nearly 340 million by 2020.

“Data analytics revenues will generate the most IoT revenues in China. This statistic is reflective of the sheer volume of smart meter connections,” Shey said.

This is indicative of the relative lack of revenues in both platform and professional services in the China market.

“Platform revenues are not as high due to, for example, a higher share of proprietary embedded telematics deployments, especially by domestic OEM brands. Professional services revenues are similarly not as high, not only due to fewer connections in the telematics segments, with a higher proportion of tethered solutions, but also because IT and consultancy services are not as mature a market segment as in some of the more developed world markets such as Japan, South Korea and the US,” he wrote.

Source-http://www.thegurureview.net/computing-category/is-china-the-fastest-growing-market-for-iot.html

IBM and Intel Going GoFlo SOI

October 23, 2015
Filed under Computing

Soitec’s CEO and board chairman has raised an eyebrow or two when he said that the iPhone 6s has multiple RF chips built on silicon-on-insulator (SOI) substrates and that Intel and IBM are using the tech for their silicon photonics push.

According to EETimes, Paul Boudre claimed that SOI is already being used by Apple and Intel even though neither company is broadcasting it. SOI appears to be on track to major market penetration even while the rest of the industry is talking FinFETs.

GlobalFoundries general manager Rutger Wijburg told SEMICON Europa 2015 that his outfit’s 22-nanometer “22FDX” SOI platform delivers FinFET-like performance but at a much lower power point and at a cost comparable to 28-nanometer planar technologies.

The $250 million, 300-millimeter FD-SOI foundry in the “Silicon Saxony” area of Germany builds on 20 years of GlobalFoundries’ investments in Europe’s largest semiconductor fabs.

GlobalFoundries said it will extend Moore’s Law by using fully depleted silicon-on-insulator (FD-SOI) transistors on wafers bought from Soitec.

Many had thought that if GloFo’s FD-SOI gamble paid off, it would still be a while before FinFET had a serious rival. But Boudre’s claims suggest that SOI is already in use.

Customers like Intel, and OEMs supplying fully depleted silicon-on-insulator (FD-SOI) RF transistors to Apple, prove that SOI and Soitec are past the cusp of the growth curve, destined to ramp up exponentially.

The problem for Soitec is that no one is really talking about it. Chipzilla is committed to the FinFET because it offers higher performance than FD-SOI, even though it draws more power too.
Boudre said that Soitec was supplying SOI wafers to Intel for other applications that don’t require high performance. “For instance, our wafers are very good for their silicon photonics projects,” he said.

Apple is already using SOI for several radio frequency (RF) chips in its front-ends because they use 20 times less power. The iPhone still uses gallium arsenide (GaAs) for its power amplifier (PA) because it needs the high-power device for good connections, but for other RF front-end chips, and in fact for all the chips that Apple wants to keep “always on,” the lower power consumption of FD-SOI is pushing smartphone makers to Soitec, Boudre said.

SOI wafers cost three-times as much as bulk silicon but the cost per die is less because of the simplified processing steps including fewer masks.

Normally GPS chips run on 0.8 volts and consume over 20 milliamps, so they must be turned off most of the time. But when they are made with SOI wafers, they can run on 0.4 volts and consume only 1 milliamp. This allows the mobile device to leave them on all the time, enabling new and more accurate location sensing and new kinds of location-based applications.
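Taking the article’s figures at face value, the power saving is simple to work out (P = V × I):

```python
bulk_v, bulk_ma = 0.8, 20   # conventional GPS chip: 0.8 V at 20 mA
soi_v, soi_ma = 0.4, 1      # SOI version: 0.4 V at 1 mA

bulk_mw = bulk_v * bulk_ma  # 16.0 mW
soi_mw = soi_v * soi_ma     # 0.4 mW
print(round(bulk_mw / soi_mw))  # 40 -> roughly 40x less power
```

A 40x cut in draw is what turns an occasionally-polled radio into one that can stay always on.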

What is amusing then is that Intel’s reason for going with FinFETs was that SOI wafers were too expensive but it did find a use for it.

GlobalFoundries’ Saxony fab will offer four varieties of its 22FDX process.

FDX-ulp for the mainstream and low-cost smartphone market. This will use body-biasing to beat FinFETs on power but equal them in performance.

FDX-uhp for networking applications, using analogue integration to match FinFETs while minimizing energy consumption.

FDX-ull for the ultra-low power required by wearables and Internet of Things applications. This will have a 1 picoamp per micron leakage.

FDX-rfa for radio frequency (RF) analogue applications, delivering 50 percent lower power and reduced system costs for LTE-A cellular transceivers, high-order multiple-input/multiple-output (MIMO) WiFi combo chips and millimeter wave radar.

Courtesy-http://www.thegurureview.net/computing-category/ibm-and-intel-going-goflo-soi.html

IBM Makes Carbon Nanotube Breakthrough

October 16, 2015
Filed under Computing

IBM’S research and development department has announced “a major engineering breakthrough” in transistor technology that could transform the mobile device space as we know it, especially wearables.

IBM scientists demonstrated a new way to shrink transistor contacts in chips, speeding up the replacement of silicon transistors with carbon nanotubes, which the firm has been working on for several years.

The company said that the breakthrough brings it closer to creating fully scaled carbon nanotube technology that will power future computing technologies while increasing performance and “opening a pathway to dramatically faster, smaller and more powerful chips”.

Carbon nanotube chips have many benefits over traditional silicon. Transistors in silicon are approaching a point of physical limitation. They have been made smaller year after year, but shrinking the size of the transistor, including the channels and contacts, without compromising performance is becoming increasingly difficult.

Carbon nanotube chips could improve the capabilities of high-performance computers because they allow these contacts to be so small that they are virtually transparent.

This means that the size of the semiconductor can decrease dramatically, while the substrate of carbon nanotubes makes the chip more energy efficient and is a soft and flexible material that could allow new device form factors.

Shu-jen Han, IBM’s manager of nanoscale science and technology, told us in an interview that wearable technology is one of the most exciting areas that this technology could transform owing to the unique property of the substrate, allowing new form factors with better performance and battery life.

However, the breakthrough isn’t about the carbon nanotube material being a better replacement for silicon, but more of an engineering innovation that addresses part of the problem in successfully rolling out better performing and more efficient chips.

“We know what the issue has been, and the limits of the technology, for years. What we solved here is a device-level issue, a one-dimensional structure. We need to make a wafer of them, a high-quality wafer, which does not exist yet,” Shu-jen said.

The next stage for IBM’s research group is to scale up the carbon nanotube technology to make reliable mass produced chips before they can make a difference to businesses and consumers.

Shu-jen said this could take five to 10 years, but could enable big data to be analysed faster and allow cloud data centres to deliver services more efficiently and economically.

Source-http://www.thegurureview.net/computing-category/ibm-makes-carbon-nanotube-breakthrough.html

Can IBM Beat Moore’s Law?

October 15, 2015
Filed under Computing

Big Blue Researchers have discovered a way to replace silicon semiconductors with carbon nanotube transistors and think that the development will push the industry past Moore’s law limits.

IBM said its researchers successfully shrunk transistor contacts in a way that didn’t limit the power of carbon nanotube devices. The chips could be smaller and faster and significantly surpass what’s possible with today’s silicon semiconductors.

The chips are made from carbon nanotubes, which consist of single atomic sheets of carbon rolled up into tubes. This means that high-performance computers may well be capable of analysing big data faster, and the battery life and power of mobile and connected devices will improve. The advance may enable cloud-based data centres to provide more efficient services, IBM claims.

Moore’s law, which has for years governed the semiconductor industry’s ability to double the processing power of chips every 24 months, is starting to reach the limits of physics with silicon. This could mean a slowing of significant computing performance boosts unless someone comes up with something fast.

IBM researchers claim to have proved that carbon nanotube transistors can work as switches at widths 10,000 times thinner than a human hair, and less than half the size of the most advanced silicon technology.

The latest research has overcome “the other major hurdle in incorporating carbon nanotubes into semiconductor devices which could result in smaller chips with greater performance and lower power consumption,” IBM said.

Electrons found in carbon transistors move more efficiently than those that are silicon-based, even as the extremely thin bodies of carbon nanotubes offer more advantages at the atomic scale, IBM says.

The new research is jump-starting the move to a post-silicon future, and paying off on $3 billion in chip research and development investment IBM announced in 2014.

Source-http://www.thegurureview.net/computing-category/can-ibm-beat-moores-law.html

IBM And ARM Team Up For IoT

September 15, 2015
Filed under Computing

IBM is teaming up with ARM to offer device and risk management for the internet of things.

For those who came in late, IBM has an IoT Foundation cloud platform. Under the deal it will be linked to ARM’s mbed-enabled devices to deliver analytics services.

It is a little odd given that both of them make and design chipsets, but they think that the fusion will enable far more data produced by autonomous IoT devices to be gathered, analysed and acted on.

Products powered by ARM’s mbed chips will automatically register with the IoT Foundation, which is built on IBM’s SoftLayer infrastructure, and connect with IBM’s cloud analytics services.

IoT Foundation already includes analytics tools designed to cope with the big data explosion, access to IBM’s Bluemix platform as a service, and security systems.

ARM said that connecting the two would enable delivery of actionable events to control equipment, or alerts and information to users, such as alarm messages on domestic appliances.

Source-http://www.thegurureview.net/computing-category/ibm-and-arm-are-teaming-up-for-iot.html

Oracle’s New Processor Goes For The Cheap

August 13, 2015
Filed under Computing

Oracle is looking to expand the market for its Sparc-based servers with a new, low-cost processor which it curiously called Sonoma.

The company isn’t saying yet when the chip will be in the shops, but the spec shows that it could become a new rival for Intel’s Xeon chips and make Oracle’s servers more competitive.

Sonoma is named after a place where they make cheap, terrible Californian wine, and Oracle aims the chip at Sparc-based servers at “significantly lower price points” than now.

This means that companies can use them for smaller, less critical applications.

Oracle has not done much with its Sparc line-up for a couple of years, and Sonoma was one of a few new chips planned. The database maker will update the Sparc T5, used in its mid-range systems, and the high-end Sparc M7. The technology is expected to filter down to the lower-tier Sonoma servers.

The Sparc M7 will have technologies for encryption acceleration and memory protection built into the chip. It will include coprocessors to speed up database performance.

According to IDG, Sonoma will take those same technologies and bring them down to lower price points. This means that people can use them in cloud computing and for smaller applications.

Oracle didn’t talk about prices or say how much cheaper the new Sparc systems will be, and it could potentially be years before Sonoma comes to market.

Source
