Is Intel Losing Interest In Android?

July 20, 2016
Filed under Computing

Intel’s mobexit is gathering pace, with the company deciding to slash its Android development.

While the outfit still claims to be chums with Google, Android development for tablets is the latest thing it says it is not interested in.

Intel has been cutting back on Android upgrades for its tablet hardware and is instead concentrating on 2-in-1s, which mostly run Windows.

Intel’s x86 version of Android was mainly for devices with Atom processors, which the chip maker is phasing out. The replacement is Apollo Lake which will run Windows, but it is unclear if it will ever support Android.

The last Android release that worked on Intel gear was Android 5.1.1 Lollipop; Intel-based mobile devices mostly run Android 5.0 or older.

What might keep Intel in Android is not its own commitment but the fact that Google is keen to make its OS work with x86 chips. Google has said that Android 7.0 Nougat will be compatible with x86 machines, which keeps Intel in the game – if it wants to be.

PC World has suggested that Intel could offload development to the independent Android-x86 Project, which last month delivered Android-x86 6.0 Release Candidate 1.

Intel is still a lead partner in Google’s Brillo. This is an embedded IoT OS with a dash of Android under the bonnet. Brillo works on Intel’s Edison development board, which can be used to make wearables, robots, smart home devices and other IoT gadgets.

But it is pretty clear that Intel is not interested in some of Google’s Android-based VR projects, such as Daydream.

Courtesy-Fud

Does Intel Need GPUs For HPCs?

June 15, 2016
Filed under Computing

Nvidia might have scored a few wins by touting its GPUs in the HPC market, but it is starting to lose ground to Intel’s co-processors, according to the company’s Diane Bryant.

In an IDC interview, Intel’s data center boss said that Nvidia gained an early lead in the market for accelerated HPC workloads when it positioned its GPUs for that task several years ago. However, there is a perception that the processors used for machine learning today are GPUs like those from Nvidia and AMD.

Bryant was a bit miffed when she was asked how Intel can compete in this market without a GPU. She said that the general purpose GPU, or GPGPU, is just another type of accelerator and not one that is uniquely suited to machine learning. She pointed instead to Knights Landing, which is a coprocessor: it is an accelerator for floating point operations, and that is exactly what a GPGPU is too.

She said that since the release of the first Xeon Phi in 2014, Intel has clawed back 33 percent of the market for HPC workloads that use a floating point accelerator.

“So we’ve won share against Nvidia, and we’ll continue to win share,” she said.

She said that Intel’s share of the machine learning business may be much smaller, but the market is still young.

“Less than one percent of all the servers that shipped last year were applied to machine learning, so to hear Nvidia is beating us in a market that barely exists yet makes me a little crazy,” she says.

Intel will continue to evolve Xeon Phi to make it better at machine learning tasks. She said that there are two aspects to machine learning – training the algorithmic models, and applying those models to the real world in front-end applications. Intel’s FPGAs and its Xeon processors mean Intel has both sides of the equation covered.
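
To make Bryant’s training/inference distinction concrete, here is a deliberately tiny, illustrative sketch (not Intel’s or Nvidia’s code, and with hypothetical toy data): the “training” loop fits a one-variable linear model by gradient descent, the kind of repetitive floating point work that accelerators are sold for, and the “inference” step then applies the fitted model to a new input, the front-end side Intel says its Xeons and FPGAs cover.

```c
/* Illustrative only: the two phases of machine learning in miniature. */
#include <stdio.h>

int main(void)
{
    /* Toy training data roughly following y = 2x + 1. */
    const double xs[] = {0.0, 1.0, 2.0, 3.0, 4.0};
    const double ys[] = {1.1, 2.9, 5.2, 6.8, 9.1};
    const int n = 5;

    double w = 0.0, b = 0.0;      /* model parameters */
    const double lr = 0.01;       /* learning rate */

    /* Training: repeatedly nudge w and b to reduce the mean squared error. */
    for (int epoch = 0; epoch < 5000; epoch++) {
        double dw = 0.0, db = 0.0;
        for (int i = 0; i < n; i++) {
            double err = (w * xs[i] + b) - ys[i];
            dw += 2.0 * err * xs[i] / n;
            db += 2.0 * err / n;
        }
        w -= lr * dw;
        b -= lr * db;
    }

    /* Inference: apply the trained model to an unseen input. */
    double x_new = 10.0;
    printf("learned w=%.2f b=%.2f, prediction for x=%.1f: %.2f\n",
           w, b, x_new, w * x_new + b);
    return 0;
}
```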

But Nvidia’s GPUs are harder for programmers to work with, which could give Intel an edge as ordinary businesses adopt machine learning. Knights Landing is “self-booting,” which means customers don’t need to pair it with a regular Xeon to boot an OS.

However, Intel’s newest Xeon Phi has a floating point performance of about three teraflops, which is a little slow compared with the five teraflops of Nvidia’s new GP100.

Courtesy-Fud

Intel’s PC Group Hit The Hardest

June 8, 2016
Filed under Computing

Intel’s restructuring axe seems to be falling on its PC client division and software areas, with more than 12,000 jobs to go.

Our well-placed sources are confident that the PC group will be the hardest hit. This is all because the PC market has stopped growing and Intel has to find new markets to make up for the loss of this business.

The latest research data from IDC indicates that the PC market will decline from 275.8 million units in 2015 to 260.8 million units in 2016, and current projections for 2017 show it slipping slightly further to 257.9 million units. At its peak in 2011 the PC market reached 364.0 million units, back when things were rosier, kids were polite to their parents, and rock stars played decent music. Those times are clearly behind us and Intel knows it.

The PC group downsizing is being supervised by Dr. Venkata “Murthy” Renduchintala, who is Intel’s number two. He is the bloke who was paid $25 million to defect from Qualcomm. Murthy has already done a high-level clean-up of the PC client group and is believed to be thinking about dusting the top of the corporate bookshelf next.

Another team which will be pummeled is Renée James’s old software outfit. People from software services and the security division formerly known as McAfee are mostly expected to go the same way as the artist formerly known as Prince.

Murthy also wants to get Intel on the right course in the IoT market, which is expected to grow from $655.8 billion in 2014 to $1.7 trillion in 2020. Intel wants a piece of that cake, and perhaps a few tea and biscuits to go with it, and it will be interesting to watch the fight over this promising market.

There is still no killer app that defines the IoT market. Right now IoT is nothing and everything.

Courtesy-Fud

Intel Looking Into Atomic Energy

May 25, 2016
Filed under Around The Net

Shortly after cancelling two generations of Atom mobile chips, Intel is putting its weight behind future low-power mobile technologies with a new research collaboration with a French atomic energy lab.

Fundamental research leading towards faster wireless networks, secure low-power technologies for the Internet of Things, and even 3D displays will be the focus of Intel’s collaboration with the French Alternative Energies and Atomic Energy Commission (CEA).

Intel and the CEA already work together in the field of high-performance computing, and a new agreement signed Thursday will see Intel fund work at the CEA’s Laboratory for Electronics and Information Technology (LETI) over the next five years, according to Rajeeb Hazra, vice president of Intel’s data center group.

The CEA was founded in 1945 to develop civil and military uses of nuclear power. Its work with Intel began soon after it ceased its atmospheric and underground nuclear weapons test programs, as it turned to computer modeling to continue its weapons research, CEA managing director Daniel Verwaerde said Thursday.

That effort continues, but the organization’s research interests today are more wide-ranging, encompassing materials science, climate, health, renewable energy, security and electronics.

These last two areas will be at the heart of the new research collaboration, which will see scientists at LETI exchanging information with those at Intel.

Both parties dodged questions about who will have the commercial rights to the fruits of their research, but each said it had protected its rights. The deal took a year to negotiate.

“It’s a balanced agreement,” said Stéphane Siebert, director of CEA Technology, the division of which LETI is a part.

Who owns what from the five-year research collaboration may become a thorny issue, for French taxpayers and Intel shareholders alike, as it will be many years before it becomes clear which technologies or patents are important.

Hazra emphasized the extent to which Intel is dependent on researchers outside the U.S. The company has over 50 laboratories in Europe, four of them specifically pursuing so-called exascale computing: systems capable of a billion billion (10^18) calculations per second.

Source-http://www.thegurureview.net/mobile-category/intel-look-to-atomic-energy-for-mobile-technologys-future.html

Did Researchers Create Lifetime Batteries?

May 4, 2016
Filed under Around The Net

Researchers at the University of California at Irvine (UCI) have accidentally – yes, accidentally – discovered a nanowire-based technology that could lead to batteries that can be charged hundreds of thousands of times.

Mya Le Thai, a PhD candidate at the university, explained in a paper published this week that she and her colleagues used nanowires: a material several thousand times thinner than a human hair that is extremely conductive and has a surface area large enough to support the storage and transfer of electrons.

Nanowires are extremely fragile and don’t usually hold up well to repeated discharging and recharging, or cycling. They expand and grow brittle in a typical lithium-ion battery, but Le Thai’s team fixed this by coating a gold nanowire in a manganese dioxide shell and then placing it in a Plexiglas-like gel to improve its reliability. All by accident.

The breakthrough could lead to laptop, smartphone and tablet batteries that last forever.

Reginald Penner, chairman of UCI’s chemistry department, said: “Mya was playing around and she coated this whole thing with a very thin gel layer and started to cycle it.

“She discovered that just by using this gel she could cycle it hundreds of thousands of times without losing any capacity. That was crazy, because these things typically die in dramatic fashion after 5,000 or 6,000 or 7,000 cycles at most.”

The battery-like structure was tested more than 200,000 times over a three-month span, and the researchers reported no loss of capacity or power.

“The coated electrode holds its shape much better, making it a more reliable option,” Thai said. “This research proves that a nanowire-based battery electrode can have a long lifetime and that we can make these kinds of batteries a reality.”

The breakthrough also paves the way for commercial batteries that could last a lifetime in appliances, cars and spacecraft.

British fuel-cell maker Intelligent Energy Holdings announced earlier this year that it is working on a smartphone battery that will need to be charged only once a week.


Courtesy-TheInq

Is Samsung Preparing For A Price War?

April 27, 2016
Filed under Computing

Samsung Electronics is changing its approach to its memory chip business to focus on market share over profit margins, and the industry will suffer for it, according to one analyst.

Bernstein Research’s senior analyst Mark C. Newman said that the competitive dynamic in the memory chip industry is not as good as previously thought, due to Samsung’s aggressive and opportunistic behavior. This is analyst speak for Samsung engaging in a supply and price war with the other big names in the memory chip making business – SK hynix and Micron.

“Rather than sit back and enjoy elevated profit margins with a 40 percent market share in DRAMs, Samsung is intent on stretching their share to closer to 50 percent,” he said.

Newman said the company is gaining significant market share in the NAND sector.

“Although Samsung cares about profits, their actions have been opportunistic and more aggressive than we predicted at the expense of laggards particularly Micron Technology in DRAMs and SK hynix in NANDs,” he said.

SK hynix is expected to suffer. “In NAND, we see Samsung continuing to stretch their lead in 3D NAND, which will put continued pressure on the rest of the field. SK hynix is one of the two obvious losers.”

Newman said that Samsung’s antics have destroyed the “level of trust” among competitors, perhaps “permanently”, just as demand has dropped drastically, with PC sales shrinking by high single digits in 2015 and this year shaping up to be the same.

“Sales of smartphones, the main savior to memory demand growth have also weakened considerably to single digit growth this year and servers with datacenters are not strong enough to absorb the excess, particularly in DRAM,” Newman said.

He is worried that Samsung could create an oversupply in the industry.

“The oversupply issue is if anything only getting worse, with higher than normal inventories now an even bigger worry. Although we were right about the shrink slowing, thus reducing supply growth, the flip side of this trend is that capital spending and R&D costs are soaring thus putting a dent in memory cost declines,” he said.

China’s potential entry into the market and new technologies will provide further worries “over the longer term.”

“Today’s oversupply situation would become infinitely worse if and when China’s XMC ramps up big amounts of capacity. New memory technologies such as 3D X-point, ReRAM and MRAM stand on the sidelines and threaten to cannibalize part of the mainstream memory market,” he said.

Courtesy-Fud

Can Samsung Beat Intel?

April 25, 2016
Filed under Computing

Samsung is closing in on Intel in the semiconductor sector as its market share increased by 0.9 percent when compared to a year earlier.

According to beancounters at IBS, the news comes on the heels of an announcement that the three-month average of the global semiconductor market ending in February fell 6.2 percent compared with the same period in 2015, a deeper drop than January’s 5.8 percent decline.

IBS chief executive Handel Jones said: “Based on talking to customers about buying patterns, we see softness. Smartphone sales are slowing, and the composition of the market is changing, with about half of all chips bought by companies in China who want low-end devices.”

In addition, over the past year memory prices have fallen by nearly half for both DRAMs and NAND-based solid-state drives as vendors try to buy market share, said Jones. “It’s more of a price issue because volumes are up.”

Jones expects the softness in the PC market to continue through this year. Demand for chips is rising in automotive and the emerging Internet of Things, but so far both sectors are relatively small, he added.

The data shows that the gap in market share between Intel and Samsung is narrowing. In 2012 the gap was 5.3 percent; it narrowed to 4.2 percent in 2013 and stood at 3.2 percent in 2015. SK Hynix, now the third largest semiconductor brand in the world, beat Qualcomm with a market share of 4.8 percent.

Courtesy-Fud

Will Intel’s Xeon Broadwell-EP Hit The Market Soon?

April 12, 2016
Filed under Computing

An Intel press slide has been leaked on the web, which means we should be seeing a workstation-grade Xeon “Broadwell-EP” processor in the shops soon.

The slide appeared on the AnandTech forums and shows that the chip will be branded under the Xeon E5-2600 V4 series and will have at least “20 per cent more cores and last-level cache” than Haswell-EP. It should be shipping on March 31st.

This CPU is slated for an HP workstation, the HP Z640, which succeeds the Z620.

The Xeon Broadwell-EP uses Intel’s 14nm process, which means 10-core chips will be available at the price of an 8-core chip from the previous generation.

The slide said that Broadwell-EP will deliver an 18 per cent average performance increase over Haswell-EP, as well as support for up to 2400MHz DDR4 memory for greater I/O throughput.

This slide follows the news that an 18-core Xeon Broadwell-EP CPU was spotted on eBay, carrying a price tag of $999. Dubbed the Xeon E5-2600 v4, the chip was listed with a base clock speed of 2.2GHz, a Turbo frequency of 3GHz, a TDP of 145W and 2.5MB of L3 cache per core, for 45MB of last-level cache in total (18 cores × 2.5MB).

Courtesy-Fud

Intel Processors Will Support Vulkan API

March 31, 2016
Filed under Computing

Intel is releasing graphics drivers that support the Vulkan 1.0 API for its chips in PCs running Windows 7, 8 and 10.

According to Intel the drivers provide beta support for the Vulkan 1.0 API for 6th Generation Intel Core and related processors.

Vulkan 1.0 was introduced last month by the industry consortium Khronos Group and is intended to replace OpenGL, which was first introduced in 1991 by Silicon Graphics. Vulkan is designed to exploit powerful GPUs and multicore CPUs, but it is still a long way behind DirectX 12 – at least in its beta condition.

With Intel’s drivers, developers will be able to exploit features of Intel GPUs, like the Iris Pro, that are integrated into chips alongside the CPU. Intel’s rival AMD has already released Vulkan drivers for its Radeon graphics processors.
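
For developers who want to check whether a Vulkan driver actually exposes their Intel GPU, the standard loader API can enumerate the physical devices the installed drivers report. The sketch below is illustrative only and assumes the Vulkan SDK headers and loader are installed; it creates a bare instance and flags any device that reports Intel’s PCI vendor ID, 0x8086.

```c
/* Hypothetical sketch: list Vulkan physical devices and flag Intel GPUs.
 * Error handling is kept minimal for brevity. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    /* Create a bare-bones Vulkan instance. */
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "vulkan-device-check",
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "No Vulkan driver available\n");
        return 1;
    }

    /* Ask the loader which physical devices the installed drivers expose. */
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice devices[16];
    if (count > 16)
        count = 16;
    vkEnumeratePhysicalDevices(instance, &count, devices);

    for (uint32_t i = 0; i < count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devices[i], &props);
        printf("%s (vendor 0x%04x)%s\n", props.deviceName, props.vendorID,
               props.vendorID == 0x8086 ? "  <- Intel GPU" : "");
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```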

Vulkan 1.0 APIs will also work with Linux-based PCs like Steam Machines. Intel has made available open-source Vulkan drivers for Linux PCs running on chips code-named Broadwell and Skylake.

Courtesy-Fud

The Linux Foundation Goes Zephyr

March 4, 2016
Filed under Computing

The Linux Foundation has launched its Zephyr Project as part of a cunning plan to create an open source, small footprint, modular, scalable, connected, real-time OS for IoT devices.

While there have been cut-down Linux implementations before, the increase in the number of smart, connected devices has made something a little more specialized more important.

Zephyr is all about minimizing the power, space and cost budgets of IoT hardware. Even a cut-down Linux needs around 200KB of RAM and 1MB of flash, which is more than the tiny microcontrollers that often control IoT end points can spare.

Zephyr has a small-footprint “microkernel” and an even tinier “nanokernel.” This allows it to be CPU architecture independent and to run in as little as 10KB of RAM while remaining scalable.

It can still support a broad range of wireless and wired technologies and of course is entirely open saucy, released under the Apache v2.0 License.

It works on Bluetooth, Bluetooth Low Energy, and IEEE 802.15.4 (6LoWPAN) at the moment and supports x86, ARM, and ARC architectures.
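
To give a flavor of how small a Zephyr application is, here is a hypothetical “hello world” style sketch. It uses the unified-kernel calls (printk, k_sleep) found in later Zephyr releases rather than the original microkernel/nanokernel APIs described above, so header paths and argument types may differ between versions.

```c
/* Hypothetical minimal Zephyr application (unified-kernel API;
 * headers and sleep arguments vary between Zephyr releases). */
#include <zephyr.h>
#include <misc/printk.h>

void main(void)
{
    while (1) {
        printk("Hello from a tiny IoT endpoint\n");
        k_sleep(1000);  /* sleep roughly one second between messages */
    }
}
```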

Courtesy-Fud
