Syber Group

Is TSMC Taking A Fall?

April 28, 2016
Filed under Computing

On Thursday, Taiwan Semiconductor Manufacturing Company (TSMC) reported an 18 percent year-over-year revenue decline for Q1 2016. The chip manufacturing giant also reported Q1 2016 net profit of $2 billion USD (NT$64.78 billion), an 8.3 percent decline from Q1 2015.

For TSMC, Q1 2016 was marked by a reduction in demand for high-end smartphones, while smartphone demand in China and emerging markets had upward momentum. From Q2 2016 onward, the company expects to get back onto a growth trajectory and is projected to hit a 5 to 10 percent growth rate in 2016.

“Our 10-nanometer technology development is on track,” said company president and co-CEO Mark Liu during the company’s Q4 2015 earnings call. “We are currently in intensive yield learning mode in our technology development. Our 256-megabit SRAM is yielding well. We expect to complete process and product qualification and begin customer product tape-outs this quarter.”

“Our 7-nanometer technology development progress is on schedule as well. TSMC’s 7 nanometer technology development leverage our 10-nanometer development very effectively. At the same time, TSMC’s 7-nanometer offers a substantial density improvement, performance improvement and power reduction from 10-nanometer.

These two technologies, 10-nanometer and 7-nanometer, will cover a very wide range of applications, including application processors for smartphone, high-end networking, advanced graphics, field-programmable gate arrays, game consoles, wearables and other consumer products.”

In Q1 2016, TSMC reached a gross margin of 44.9 percent, an operating margin of 34.6 percent and a net profit margin of 31.8 percent. For Q2 2016, the company is expecting revenue between roughly $6.65 billion and $6.74 billion USD, a gross margin between 49 and 51 percent, and an operating margin between 38.5 and 40.5 percent.
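As a sanity check, the reported margins and the $2 billion profit figure pin down the quarter's revenue. The sketch below is illustrative arithmetic only, using the approximate dollar figures quoted above; it backs out roughly $6.3 billion in Q1 revenue.

```python
# Rough margin arithmetic for TSMC's Q1 2016 figures (illustrative only;
# dollar figures are the approximate values quoted in the article).
net_profit_usd = 2.0e9   # reported Q1 2016 net profit
net_margin = 0.318       # reported net profit margin

# Revenue implied by profit / net margin
implied_revenue = net_profit_usd / net_margin
print(f"Implied Q1 2016 revenue:  ${implied_revenue / 1e9:.2f}B")  # ~ $6.29B

# Gross and operating profit implied by the reported margins
gross_profit = implied_revenue * 0.449
operating_profit = implied_revenue * 0.346
print(f"Implied gross profit:     ${gross_profit / 1e9:.2f}B")
print(f"Implied operating profit: ${operating_profit / 1e9:.2f}B")
```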

Chips for communications and industrial applications represented over 80 percent of TSMC’s revenue in FY 2015. The company was also able to improve its margins by increasing 16-nanometer production and, like many other semiconductor companies, is preparing for an expected upswing sometime in 2017.

In February, a 6.4-magnitude earthquake struck southern Taiwan, where TSMC’s 12-inch Fab 14, a current site of 16-nanometer production, is located. The company expected a manufacturing impact of just over 1 percent in the region, with a slight reduction in wafer shipments for the quarter.

“Although the February 6 earthquake caused some delay in wafer shipments in the first quarter, we saw business upside resulting from demand increases in mid- and low-end smartphone segments and customer inventory restocking,” said Lora Ho, Senior Vice President and Chief Financial Officer of TSMC.

“We expect our business in the second quarter will benefit from continued inventory restocking and recovery of the delayed shipments from the earthquake.”

In fiscal year 2016, the company will spend between $9 and $10 billion on ramping up the 16-nanometer process node, constructing Fab 15 for 12-inch wafers in Nanjing, China, and beginning commercial production of the 10-nanometer FinFET process at this new facility. Samsung and Intel are also expected to start mass production of 10-nanometer products by the end of 2016.

During its Q4 2015 earnings call, company president and co-CEO Mark Liu stated that the company is developing a 7-nanometer process node and plans to begin volume production sometime in 2018. Meanwhile, since January 2015, a separate research and development team at TSMC has been laying the groundwork for a 5-nanometer process, which the company expects to bring into commercial production sometime in 1H 2020.

In Q1 2016, shipments of 16- and 20-nanometer wafers accounted for around 23 percent of the company’s total wafer revenue.

Courtesy-Fud

Is Samsung Preparing For A Price War?

April 27, 2016
Filed under Computing

Samsung Electronics is changing its approach to its memory chip business to focus on market share over profit margins, and the industry will suffer, according to one analyst.

Bernstein Research senior analyst Mark C. Newman said that the competitive dynamic in the memory chip industry is not as good as previously thought, due to Samsung’s aggressive and opportunistic behavior. This is analyst-speak for Samsung engaging in a supply and price war with the other big names in the memory chip making business: SK hynix and Micron.

“Rather than sit back and enjoy elevated profit margins with a 40 percent market share in DRAMs, Samsung is intent on stretching their share to closer to 50 percent,” he said.

Newman said the company is gaining significant market share in the NAND sector.

“Although Samsung cares about profits, their actions have been opportunistic and more aggressive than we predicted at the expense of laggards particularly Micron Technology in DRAMs and SK hynix in NANDs,” he said.

SK hynix is expected to suffer. “In NAND, we see Samsung continuing to stretch their lead in 3D NAND, which will put continued pressure on the rest of the field. SK hynix is one of the two obvious losers.”

Newman said that Samsung’s antics have destroyed the “level of trust” among competitors, perhaps “permanently.” Meanwhile, demand has dropped drastically, with PC sales growth down to high single digits in 2015 and this year shaping up to be the same.

“Sales of smartphones, the main savior to memory demand growth have also weakened considerably to single digit growth this year and servers with datacenters are not strong enough to absorb the excess, particularly in DRAM,” Newman said.

He is worried that Samsung could create an oversupply in the industry.

“The oversupply issue is if anything only getting worse, with higher than normal inventories now an even bigger worry. Although we were right about the shrink slowing, thus reducing supply growth, the flip side of this trend is that capital spending and R&D costs are soaring thus putting a dent in memory cost declines,” he said.

China’s potential entry into the market and new technologies will provide further worries “over the longer term.”

“Today’s oversupply situation would become infinitely worse if and when China’s XMC ramps up big amounts of capacity. New memory technologies such as 3D X-point, ReRAM and MRAM stand on the sidelines and threaten to cannibalize part of the mainstream memory market,” he said.

Courtesy-Fud

Is Apple Trying To Rain On Intel’s Parade?

April 5, 2016
Filed under Computing

Intel’s cunning plans for computers that recognize human emotion using its RealSense 3D camera have been killed off in the short term by Apple.

RealSense is a mix of infrared, laser and optical cameras to measure depth and track motion. It can be used on a drone that can navigate its own way through a city block, but it is also good at detecting changes in facial expressions, and Intel wanted to give RealSense the ability to read human emotions by combining it with an emotion recognition technology developed by Emotient.

Plugging in Emotient allowed RealSense to detect whether people are happy or sad by analyzing movement in their lips, eyes and cheeks. Intel said that it could detect “anger, contempt, disgust, fear,” and other sentiments.

A few months ago the fruity cargo cult Apple acquired Emotient, and Intel has since removed the Emotient plug-in from the latest version of the RealSense software development kit.

It is not clear at this point whether Apple told Intel that it now owned the plug-in and Intel had to sling its hook, or whether Intel did not want Jobs’ Mob anywhere near its technology.

The RealSense SDK has features that allow it to recognize some facial expressions, but it’s unclear if they’ll be as effective as the Emotient technology.

Courtesy-Fud

Samsung Shows Off The BGA SSD

April 4, 2016
Filed under Around The Net

During Samsung’s 2016 SSD Forum in Japan, the company took the wraps off its first ever ball-grid array (BGA) solid-state drive for mobile devices, the PM971. This particular SSD aims to replace module-based M.2 drives in the 2-in-1 hybrid PC market. The company claims it will offer improved thermals, up to 10 percent more battery life and a reduction in vertical storage height for OEMs, product designers and system manufacturers.

The Samsung PM971 is built using the company’s Photon controller and runs MLC 3D V-NAND (contrary to the picture above, PC Watch claims it is actually 3-bits per cell). The drive will be available in 128GB, 256GB and 512GB capacities and will feature sequential reads up to 1,500MB/s, sequential writes up to 600MB/s, random reads up to 190,000 IOPS and random writes up to 150,000 IOPS.

In general, SSDs with BGA packaging are considerably smaller than those using the M.2 form factor, and Intel has claimed that using a PCI-E BGA SSD could allow an increase in battery size of around 10 percent compared to an M.2 2260 SSD (with GPIO using a 1.8V power rail instead of 3.3V), lower thermals than M.2 (heat conducts through the BGA balls to the motherboard instead of through M.2 mounting screws), and a vertical height savings of 0.5mm to 1.5mm in notebook devices.
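For context, the random-I/O ratings above can be converted into approximate bandwidth. The 4KB transfer size used below is an assumption (the article does not state the benchmark block size), but it is the conventional choice for random I/O figures.

```python
# Convert random-I/O IOPS ratings to approximate bandwidth.
# Assumes 4KB transfers, the usual block size quoted for random I/O
# benchmarks (an assumption; the article does not state the size).
BLOCK_SIZE = 4 * 1024  # bytes

def iops_to_mbps(iops: int, block_size: int = BLOCK_SIZE) -> float:
    """Approximate bandwidth in MB/s for a given IOPS rating."""
    return iops * block_size / 1e6

print(f"Random read:  {iops_to_mbps(190_000):.2f} MB/s")  # 778.24 MB/s
print(f"Random write: {iops_to_mbps(150_000):.2f} MB/s")  # 614.40 MB/s
```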

The nice thing about BGA SSDs is that they are “complete” storage solutions, integrating NAND flash memory, the NAND controller and DRAM into a single package. Currently, there are several BGA M.2 form factors being proposed that will make single-chip SSDs a reality sooner rather than later, the result of a collaboration between HP, Intel, Lenovo, Micron, SanDisk, Seagate and Toshiba. The four BGA SSD packages proposed are Type 1620, Type 2024, Type 2228 and Type 2828, ranging between 16 x 20 millimeters and 28 x 28 millimeters with up to 2-millimeter vertical height. It is currently unknown whether the Samsung PM971 adopts any of these proposed BGA M.2 standards.

Based on the demonstration at the 2016 Samsung SSD Forum in Japan, the PM971 offers decent performance thanks to a PCI-E 3.0 x4 interface and the company’s new Photon controller. According to the PC Watch website, the drive is physically smaller than an SD card and Samsung expects device manufacturers and OEMs to begin adoption in the second half of 2016 or the first half of 2017.

Courtesy-Fud

Will Razer’s External Graphics Box Fail?

March 30, 2016
Filed under Computing

We first saw the Razer Core, an external graphics box that connects to a notebook via a Thunderbolt 3 port, back at CES 2016 in January, and today Razer has finally unveiled a few more details, including the price, availability date and compatibility.

At the GDC 2016 show in San Francisco, Razer announced that the Core will be ready in April at a price of US $499. As expected, it has only been validated on the Razer Blade Stealth and the newly introduced Razer Blade 2016 Edition notebooks, but as it uses the Thunderbolt 3 interface, it should be compatible with any other notebook, as long as the manufacturer wants it to be.

With dimensions of 105 x 353 x 220mm, the Razer Core is reasonably portable. It comes with a 500W PSU and features four USB 3.0 ports, Gigabit Ethernet and the Thunderbolt 3 port used to connect it to a notebook.

As far as graphics card support is concerned, Razer says the Core will work with any AMD Radeon graphics card since the Radeon 290 series, including the latest R9 Fury, R9 Nano and Radeon 300 series, as well as pretty much every Nvidia Maxwell-based graphics card since the Geforce GTX 750/750 Ti, although we are not sure why you would pair a US $500 box with a US $130 graphics card. The maximum TDP for the graphics card is set at 375W, which means that all dual-GPU solutions are out of the picture, so it will go as far as the R9 Fury X or the GTX Titan X.

There aren’t many notebooks that feature a Thunderbolt 3 port, and we have heard before that Thunderbolt 3 might have certain issues with latency, which is probably why other manufacturers, like MSI and Alienware, went with their own proprietary connectors. Of course, Razer has probably done the math, but we will surely keep a closer eye on it when it ships in April. Both AMD and Nvidia are tweaking their drivers and already support external graphics, so it probably will not matter which graphics card you pick.

According to Razer, the Core will be available in April, priced at US $499. Razer has already started taking pre-orders and offers a US $100 discount if you buy the Core with one of its notebooks, the Razer Blade 2016 or the Blade Stealth.

Courtesy-Fud

MediaTek Shows Off The Helio X25 Chip

March 28, 2016
Filed under Computing

MediaTek has told Fudzilla that the Helio X25 SoC is not only real, but that it is a “turbo” version of the Helio X20.

Meizu is expected to be one of the first companies to use the X25. Last year it was also the first to use MTK 6795T for its Meizu MX5 phone. In that case the “T” suffix stood for Turbo. This phone was 200 MHz faster than the standard Helio X10 “non T” version.

In 2016, MediaTek decided to use the new Helio X25 name because of a commercial arrangement. MediaTek didn’t name any of the partners, but confirmed that the CPU and GPU will be faster. It did not mention specific clock speeds. Below is a diagram of the Helio X20, and we assume that the first “eXtreme performance” cluster will get a frequency boost, as will the GPU.

The Helio X25 will not have any architectural changes; it is just a faster version of the X20, just as the MTK 6795T was a faster version of the MTK 6795. According to the company, the Helio X25 will be available in May.

This three-cluster Helio X25 SoC has real potential and should be one of the most advanced mobile solutions when it hits the market. The first leaked scores of the Helio X20 suggest great performance, and the X25 should score even better. There should be a dozen design wins with the Helio X20/X25, most of them yet to be announced. A few Helio X25 announcements should come soon, but at least we now know there will be an even faster version of the three-cluster processor.

Courtesy-Fud

 

Is The GPU Market Going Down?

March 25, 2016
Filed under Computing

The global GPU market has fallen by 20 per cent over the last year.

According to Digitimes, it fell to fewer than 30 million units in 2015, and the outfit suffering most was AMD. The largest graphics card player, Palit Microsystems, which has several brands including Palit and Galaxy, shipped 6.9-7.1 million graphics cards in 2015, down 10 per cent on the year. Asustek Computer shipped 4.5-4.7 million units in 2015, while Colorful shipped 3.9-4.1 million units and is aiming to raise its shipments by 10 per cent in 2016.

Micro-Star International (MSI) enjoyed healthy graphics card shipments of 3.45-3.55 million in 2015, up 15 per cent on the year, and EVGA, which has a tight partnership with Nvidia, also saw significant shipment growth, while Gigabyte suffered a slight drop on the year. Sapphire and PowerColor suffered dramatic drops in shipments in 2015.

There are fears that several of the smaller GPU makers could be forced out of the market once AMD gets its act together with the arrival of Zen and Nvidia launches its next-generation GPU architecture later in 2016.

Courtesy-Fud

Qualcomm Jumps Into VR

March 24, 2016
Filed under Computing

Qualcomm has thrown its hat into the virtual reality (VR) ring with the launch of the Snapdragon VR SDK for Snapdragon-based smartphones and VR headsets.

The SDK gives developers access to advanced VR features, according to Qualcomm, allowing them to simplify development and attain improved performance and power efficiency with Qualcomm’s Snapdragon 820 processor, found in Android smartphones such as the Galaxy S7 and tipped to feature in upcoming VR headsets.

In terms of features, the development kit offers tools such as digital signal processing (DSP) sensor fusion, which allows devs to use the “full breadth” of technologies built into the Snapdragon 820 chip to create more responsive and immersive experiences.

It will help developers combine high-frequency inertial data from gyroscopes and accelerometers, and there is what the company calls “predictive head position processing” based on its Hexagon DSP, while Qualcomm’s Symphony System Manager provides easier access to power and performance management for more stable frame rates in VR applications running on less powerful devices.
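The gyroscope-plus-accelerometer fusion described above can be illustrated with a textbook complementary filter. This is a generic sketch of the technique, not Qualcomm’s DSP sensor-fusion implementation; all names and constants here are illustrative.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One update step of a complementary filter for a single tilt axis.

    Blends the integrated gyroscope rate (accurate short-term, drifts
    long-term) with the accelerometer's gravity-derived angle (noisy
    short-term, stable long-term). Generic illustration only.
    """
    gyro_angle = angle + gyro_rate * dt         # integrate angular rate
    accel_angle = math.atan2(accel_x, accel_z)  # tilt from gravity vector
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Stationary device lying flat: the gyro reads a small drift while the
# accelerometer sees gravity straight down, so the fused angle stays
# near zero instead of drifting.
angle = 0.0
for _ in range(100):  # 100 steps of 10 ms = 1 second
    angle = complementary_filter(angle, gyro_rate=0.01,
                                 accel_x=0.0, accel_z=9.81, dt=0.01)
print(f"fused angle after 1 s: {angle:.4f} rad")
```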

Fast motion-to-photon will offer single-buffer rendering to reduce latency by up to 50 percent, while stereoscopic rendering with lens correction offers support for 3D binocular vision with color correction and barrel distortion, improving the visual quality of graphics and video and enhancing the overall VR experience.
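The barrel distortion mentioned here is typically implemented with a radial polynomial model. The sketch below shows the common r' = r(1 + k1·r² + k2·r⁴) form; the coefficients are made up for illustration, and this is not Qualcomm’s actual correction code.

```python
def barrel_distort(x, y, k1=-0.15, k2=0.05):
    """Apply radial (barrel) distortion to normalized image coordinates.

    Uses the common polynomial model r' = r * (1 + k1*r^2 + k2*r^4).
    A negative k1 pulls points toward the center, matching the barrel
    pre-distortion a VR compositor applies to cancel the headset lens's
    pincushion effect. Coefficients here are purely illustrative.
    """
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point near the corner of the frame is pulled inward much more than
# one near the center.
print(barrel_distort(0.1, 0.1))
print(barrel_distort(0.7, 0.7))
```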


Rounding off the features is VR layering, which improves overlays in a virtual world to reduce distortion.

David Durnil, senior director of engineering at Qualcomm, said: “We’re providing advanced tools and technologies to help developers significantly improve the virtual reality experience for applications like games, 360 degree VR videos and a variety of interactive education and entertainment applications.

“VR represents a new paradigm for how we interact with the world, and we’re excited to help mobile VR developers more efficiently deliver compelling and high-quality experiences on upcoming Snapdragon 820 VR-capable Android smartphones and headsets.”

The Snapdragon VR SDK will be available to developers in the second quarter through the Qualcomm Developer Network.

The launch of Qualcomm’s VR SDK comes just moments after AMD also entered the VR arena with the launch of the Sulon Q, a VR-ready wearable Windows 10 PC.

Courtesy-TheInq

 

Intel Putting RealSense Into VR

March 16, 2016
Filed under Around The Net

Intel is adapting its RealSense depth camera into an augmented reality headset design that it might license to other manufacturers.

The plan is not official yet but appears to have been leaked to the Wall Street Journal. Achin Bhowmik, who oversees RealSense as vice president and general manager of Intel’s perceptual computing group, declined to discuss unannounced development efforts.

But he said Intel has a tradition of creating prototypes for products like laptop computers to help persuade customers to use its components. “We have to build the entire experience ourselves before we can convince the ecosystem,” Bhowmik said.

Intel appeared to be working on an augmented-reality headset when it teamed up with IonVR on a design that could work with a variety of operating systems, including Android and iOS. Naturally, it had a front-facing RealSense camera.

The RealSense depth camera has been in development for several years and was shown as a viable product technology at the Consumer Electronics Show in 2014. Since then, little has happened, and Microsoft’s Kinect sensor technology, used for Windows Hello in the Surface Pro 4 and Surface Book, knocked it aside.

Intel’s biggest issue is that it is talking about making a consumer product, which is something it has never got the hang of.

RealSense technology is really good at translating real-world objects into virtual space. In fact, it is a lot better at this than HoloLens, because it can scan the user’s hands and translate them into virtual objects that can manipulate other virtual objects.

Courtesy-Fud

The Linux Foundation Goes Zephyr

March 4, 2016
Filed under Computing

The Linux Foundation has launched its Zephyr Project as part of a cunning plan to create an open source, small footprint, modular, scalable, connected, real-time OS for IoT devices.

While there have been cut-down Linux implementations before, the increase in the number of smart, connected devices has made something a little more specialized more important.

Zephyr is all about minimizing the power, space, and cost budgets of IoT hardware. For example, even a cut-down Linux needs 200KB of RAM and 1MB of flash, which is too much for IoT endpoints that will often be controlled by tiny microcontrollers.

Zephyr has a small-footprint “microkernel” and an even tinier “nanokernel.” This makes it CPU architecture independent and scalable, able to run in as little as 10KB of RAM.

It can still support a broad range of wireless and wired technologies and, of course, is entirely open saucy, released under the Apache v2.0 license.

It works on Bluetooth, Bluetooth Low Energy, and IEEE 802.15.4 (6LoWPAN) at the moment and supports x86, ARM, and ARC architectures.

Courtesy-Fud
