Syber Group

Nvidia NVLINK 2.0 Going Into IBM Servers

August 31, 2016 by  
Filed under Computing

On Monday, PCWorld reported that the first servers expected to use Nvidia’s second-generation NVLINK 2.0 technology will be arriving sometime next year using IBM’s upcoming Power9 chip family.

IBM launched its Power8 lineup of superscalar symmetric multiprocessors back in August 2013 at the Hot Chips conference, and the first systems became available in August 2014. The announcement was significant because it signaled the beginning of a continuing partnership between IBM and Nvidia to develop GPU-accelerated IBM server systems, beginning with the Tesla K40 GPU.

The result was an HPC “tag-team” where IBM’s Power8 architecture, a 12-core chip with 96MB of embedded memory, would eventually go on to power Nvidia’s next-generation Pascal architecture which debuted in April 2016 at the company’s GPU Technology Conference.

NVLINK, first announced in March 2014, uses a proprietary High-Speed Signaling interconnect (NVHS) developed by Nvidia. The company says NVHS transmits data over a differential pair running at up to 20Gbps, so eight of these differential 20Gbps connections will form a 160Gbps “Sub-Link” that sends data in one direction. Two sub-links—one for each direction—will form a 320Gbps, or 40GB/s bi-directional “Link” that connects processors together in a mesh framework (GPU-to-GPU or GPU-to-CPU).
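The bandwidth arithmetic above can be checked in a few lines; this is just the article's stated figures, restated as a calculation:

```python
# NVLink 1.0 math as described: 20Gbps differential pairs, eight pairs
# per sub-link, two sub-links (one per direction) per link.
LANE_GBPS = 20           # one differential pair, one direction
LANES_PER_SUBLINK = 8

sublink_gbps = LANE_GBPS * LANES_PER_SUBLINK   # 160 Gbps, one direction
link_gbps = 2 * sublink_gbps                   # 320 Gbps bi-directional
link_gbs = link_gbps / 8                       # 40 GB/s

print(sublink_gbps, link_gbps, link_gbs)  # 160 320 40.0
```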

NVLINK lanes upgrade from 20Gbps to 25Gbps

IBM is projecting its Power9 servers to be available beginning in the middle of 2017, with PCWorld reporting that the new processor lineup will include support for NVLINK 2.0 technology. Each NVLINK lane will communicate at 25Gbps, up from 20Gbps in the first iteration. With eight differential lanes, this translates to a 400Gbps (50GB/s) bi-directional link between CPUs and GPUs, or 25 percent more bandwidth than the first generation, if the information is correct.

NVLINK 2.0 capable servers arriving next year

Meanwhile, Nvidia has yet to release any NVLINK 2.0-capable GPUs, but a company presentation slide in Korean suggests that the technology will first appear in Volta GPUs, which are also scheduled for release sometime next year. We were originally under the impression that the new GPU architecture would arrive in 2018, as per Nvidia's roadmap. But a source hinted last month that Volta would be getting the 16nm FinFET treatment and may show up in roughly the same timeframe as AMD's HBM 2.0-powered Vega, sometime in 2017. After all, it is easier for Nvidia to launch sooner if the new architecture is built on the same node as the Pascal lineup.

Still ahead of PCI-Express 4.0

Nvidia claims that PCI-Express 3.0 (32GB/s with x16 bandwidth) significantly limits a GPU’s ability to access a CPU’s memory system and is about “four to five times slower” than its proprietary standard. Even PCI-Express 4.0, releasing later in 2017, is limited to 64GB/s on a slot with x16 bandwidth.

To put this in perspective, Nvidia’s Tesla P100 Accelerator uses four 40GB/s NVLINK ports to connect clusters of GPUs and CPUs, for a total of 160GB/s of bandwidth.

With a generational NVLINK upgrade from 40GB/s to 50GB/s bi-directional links, the company could release a future Volta-based GPU with four 50GB/s NVLINK ports totaling 200GB/s of bandwidth, well above and beyond the specifications of the new PCI-Express standard.
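Putting the article's numbers side by side makes the comparison concrete. Note that the four-port Volta configuration is hypothetical, extrapolated from the Tesla P100's layout:

```python
# Aggregate bandwidth figures from the article, in GB/s.
pcie3_x16 = 32            # PCI-Express 3.0, x16 slot
pcie4_x16 = 64            # PCI-Express 4.0, x16 slot
p100_nvlink = 4 * 40      # Tesla P100: four NVLink 1.0 ports -> 160
volta_nvlink = 4 * 50     # hypothetical Volta: four NVLink 2.0 ports -> 200

print(p100_nvlink / pcie3_x16)   # 5.0 -- the "four to five times" claim
print(volta_nvlink / pcie4_x16)  # 3.125 -- still ahead of PCIe 4.0
```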

Courtesy-Fud

August 29, 2016 by  
Filed under Around The Net

MIT researchers have developed a way to transfer wireless data to a smartphone about three times faster, and over twice the distance, of existing technology.

The researchers developed a technique to coordinate multiple wireless transmitters by synchronizing their wave phases, according to a statement from MIT on Tuesday. Multiple independent transmitters will be able to send data over the same wireless channel to multiple independent receivers without interfering with each other.

Since wireless spectrum is scarce, and network congestion is only expected to grow, the technology could have important implications.

The researchers called the approach MegaMIMO 2.0 (Multiple Input, Multiple Output).

For their experiments, the researchers set up four laptops in a conference room setting, allowing signals to roam over 802.11 a/g/n Wi-Fi. The speed and distance improvements are expected to apply to cellular networks as well. The technology is described in a video and in a technical paper (registration required), which was presented this week to the Association for Computing Machinery's Special Interest Group on Data Communications (SIGCOMM 16).
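The core idea of phase-synchronized transmitters can be illustrated with a toy calculation (this is only a sketch of the principle, not MegaMIMO itself): when two transmitters' carrier phases are aligned, their signals add constructively at a receiver; when they drift apart, the signals cancel.

```python
import cmath

# Each transmitter contributes a unit-amplitude carrier with some phase.
# The receiver sees the magnitude of their complex sum.
def received_amplitude(phase_offsets):
    return abs(sum(cmath.exp(1j * p) for p in phase_offsets))

print(received_amplitude([0.0, 0.0]))       # 2.0  -- synchronized, constructive
print(received_amplitude([0.0, cmath.pi]))  # ~0.0 -- out of phase, destructive
```

Keeping independent transmitters this tightly aligned is precisely the coordination problem the MIT technique addresses.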

The researchers, from MIT’s Computer Science and Artificial Intelligence Lab, are: Ezzeldin Hamed, Hariharan Rahul, Mohammed Abdelghany and Dina Katabi.

Courtesy-http://www.thegurureview.net/mobile-category/mit-researchers-develop-technique-to-triple-wireless-speeds.html

August 26, 2016 by  
Filed under Around The Net

Apple is trying to convince the world it is "coming up with something new" by talking a lot about augmented reality.

It is a fairly logical development: the company has long operated a reality distortion field to create an alternative universe where its products are new, revolutionary and light years ahead of everyone else's. It will be curious to see how Apple integrates its reality with the real world, given that it is having a problem with that.

Apple CEO Tim Cook has been doing his best to convince the world that Apple really is working on something. He needs to do this as the iPhone cash cow starts to dry up and Jobs Mob appears to have no products to replace it.

In an interview with The Washington Post published Sunday, Cook said Apple is “doing a lot of things” with augmented reality (AR), the technology that puts digital images on top of the real world.
He said:

“I think AR is extremely interesting and sort of a core technology. So, yes, it’s something we’re doing a lot of things on behind that curtain we talked about.”

However, Apple is light years behind work being done by Microsoft with its HoloLens headset and by the startup Magic Leap, whose so-called cinematic reality is in development now.

Cook appears to retreat to AR whenever he is under pressure. But so far he has never actually said that the company is developing any AR products.

Apple has also snapped up several companies and experts in the AR space. And in January, the Financial Times claimed that the company has a division of hundreds of people researching the technology.
But it would be hard for Apple to get an AR product out that fits its ethos, and certainly not for years. Meanwhile, it is unlikely we will see anything new before Microsoft and Google get their products out.

Courtesy-Fud

 

August 24, 2016 by  
Filed under Around The Net

CVS has rolled out CVS Pay, a payment program built into its mobile app. It allows customers to pay in store for prescriptions by scanning a barcode at the register.

Payments will be backed by a customer’s credit or debit card, the company said.

CVS Pay is currently available in New York, New Jersey, Pennsylvania and Delaware; a nationwide rollout at all 9,600 stores is expected to kick off later this year.

CVS doesn’t support Apple Pay or other NFC-based payment technologies, and its use of barcodes for payments is reminiscent of the way Starbucks customers pay for coffee. Working with barcodes was a faster way for CVS to deliver more convenient in-store payments, analysts said.

Other retailers have created in-store payments through their own apps. Walmart created Walmart Pay in December to allow payments through mobile device QR codes that can be read at checkout registers.

“There’s nothing really innovative here with CVS Pay,” said Gartner analyst Avivah Litan on Friday. “They are pretty much following the trend. It’s just mobile commerce with a credit card attached. It’s no big deal to put a credit card in a wallet.”

At one point, CVS was working with Walmart and dozens of other major retailers in the Merchant Customer Exchange, which was designed to process mobile payments electronically through bank accounts and not credit cards to cut out the card processing cost that merchants paid to banks. But MCX ended its pilot of its mobile app, CurrentC, in June. Analysts have predicted the concept will not continue.

Source-http://www.thegurureview.net/mobile-category/cvs-debuts-cvs-pay.html

August 22, 2016 by  
Filed under Computing

It is that time of the year: Apple, Qualcomm, MediaTek and now Samsung will have 10nm SoCs ready for phones in early 2017. Samsung wants to use its own 10nm SoC in the Galaxy S8, expected in late February 2017, though probably alongside a mix of 10nm Snapdragons.

The name of Samsung’s next-generation Exynos is uninspired: you don’t call your much better chip just the Exynos 8895, though that might not be the final name.

The Korean giant went from the Exynos 7420, the first 14nm SoC for Android, to the Exynos 8890 a year later, still on 14nm but with a custom Exynos M1 “Mongoose” plus Cortex-A53 eight-core combination.

The new SoC is rumored to come with a 4GHz clock. The same leak suggests that the Snapdragon 830 can reach 3.6GHz, which would be quite an increase from the 2.15GHz the company gets with the Snapdragon 820. Samsung’s Exynos 8890 stops at 2.6GHz with one or two cores running and drops to 2.3GHz when three or four cores from the main cluster run. Call us sceptics, but 4GHz sounds like quite a leap from the previous generation.

Let us remind ourselves that clock speed on its own means little, almost as little as an AnTuTu score. It tells you the maximum clock of an SoC, but what you really want to know is performance per watt, or how many TFLOPS you can expect in the best case. A clock speed without knowing the architecture is insufficient for any analysis; we have seen 4GHz processors that were slower than 2.5GHz processors.
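A back-of-the-envelope model shows why: throughput is roughly instructions-per-clock times clock, and the IPC figures below are invented purely for illustration.

```python
# Throughput ~ IPC x clock. Clock alone tells you nothing about IPC.
def throughput(ipc, clock_ghz):
    return ipc * clock_ghz * 1e9   # instructions per second

wide_core = throughput(ipc=4.0, clock_ghz=2.5)    # wide, efficient design
narrow_core = throughput(ipc=1.5, clock_ghz=4.0)  # high clock, low IPC

print(wide_core > narrow_core)  # True: the 2.5GHz design wins
```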

The fact that Samsung continued to use the Snapdragon 820 for its latest and greatest Galaxy Note 7 means that the company still needs Qualcomm, and we don’t think this is going to change anytime soon. Qualcomm traditionally has a better-quality modem, tailored for the US, China, Japan and even complex Europe and the rest of the world.

Courtesy-Fud

August 19, 2016 by  
Filed under Computing

Intel is acquiring deep-learning startup Nervana Systems in a deal that could help it make up for lost ground in the increasingly hot area of artificial intelligence.

Founded in 2014, California-based Nervana offers a hosted platform for deep learning that’s optimized “from algorithms down to silicon” to solve machine-learning problems, the startup says.

Businesses can use its Nervana cloud service to build and deploy applications that make use of deep learning, a branch of AI used for tasks like image recognition and uncovering patterns in large amounts of data.

Also of interest to Intel, Nervana is developing a specialty processor, known as an ASIC, that’s custom built for deep learning.

Financial terms of the deal were not disclosed, but one estimate put the value above $350 million.

“We will apply Nervana’s software expertise to further optimize the Intel Math Kernel Library and its integration into industry standard frameworks,” Diane Bryant, head of Intel’s Data Center Group, said in a blog post. Nervana’s expertise “will advance Intel’s AI portfolio and enhance the deep-learning performance and TCO of our Intel Xeon and Intel Xeon Phi processors.”

Though Intel also acquired AI firm Saffron late last year, the Nervana acquisition “clearly defines the start of Intel’s AI portfolio,” said Paul Teich, principal analyst with Tirias Research.

“Intel has been chasing high-performance computing very effectively, but their hardware-design teams missed the convolutional neural network transition a few years ago,” Teich said. CNNs are what’s fueling the current surge in artificial intelligence, deep learning and machine learning.

As part of Intel, Nervana will continue to operate out of its San Diego headquarters, cofounder and CEO Naveen Rao said in a blog post.

The startup’s 48-person team will join Intel’s Data Center Group after the deal’s close, which is expected “very soon,” Intel said.

Source- http://www.thegurureview.net/aroundnet-category/intel-to-acquire-deep-learning-company-nervana.html

August 17, 2016 by  
Filed under Consumer Electronics

The driverless car market is expected to grow to $42 billion by 2025 and Nvidia has a cunning plan to grab as much of that market as possible with its current automotive partnerships.

Nvidia has recently started to take in more cash from its car business, earning $113 million from its automotive segment in fiscal Q1 2017. While that is not much, it represents a 47 percent increase over the year before. Automotive revenue is now about 8.6 percent of total revenue, and it is set to get higher.

BMW, Tesla, Honda and Volkswagen are all using Nvidia gear in one way or another.

BMW has been using Nvidia infotainment systems for years, and this seems to have been Nvidia’s way into the industry. Tesla has a 17-inch touchscreen display powered by Nvidia, and its all-digital 12.3-inch instrument cluster also uses Nvidia GPUs. Honda uses Tegra processors for its Honda Connect infotainment system.

But rumors are that Nvidia is hoping to make a killing from the move to driverless cars. The company is already on the second version of its Drive PX self-driving platform. Nvidia claims that Drive PX recently learned how to navigate 3,000 miles of road in just 72 hours.

BMW, Ford, and Daimler are testing Drive PX, and Audi has used Nvidia’s GPUs to help pilot some of its self-driving vehicles in the past. In fact, Audi has claimed that the platform can also be used to assist normal driving.

It said that the deep learning capabilities of Drive PX allowed its vehicles to learn certain self-driving capabilities in four hours instead of the two years that it took on competing systems.

According to Automotive News Europe Nvidia is working closely with Audi as its primary brand for Drive PX but then it will move to Volkswagen, Seat, Skoda, Lamborghini, and Bentley.
Tesla also appears to think that Nvidia is a key element for driverless car technology. At the 2015 GPU Technology Conference, the company said that Tegra GPUs will prove “really important for self-driving in the future.” Tesla does not use the Drive PX system yet, but it could go that way.

Courtesy-Fud

 

August 15, 2016 by  
Filed under Security

Carnegie Mellon University professor Lorrie Cranor, who is the US FTC’s technology guru, has debunked a myth that it is a good idea to change your password often.

Talking to Ars Technica, she said that while frequent password changes can lock hackers out, they may make security worse.

She told the BSides security conference in Las Vegas that frequent password changes do little to improve security and very possibly make security worse by encouraging the use of passwords that are more susceptible to cracking.

A study published in 2010 by researchers from the University of North Carolina at Chapel Hill more or less confirmed her views. The researchers obtained the cryptographic hashes to 10,000 expired accounts that once belonged to university employees, faculty, or students who had been required to change their passcodes every three months. Researchers received data not only for the last password used but also for passwords that had been changed over time.

By studying the data, the researchers identified common techniques account holders used when they were required to change passwords. A password like "tarheels#1", for instance (excluding the quotation marks), frequently became "tArheels#1" after the first change, "taRheels#1" on the second change and so on. Or it might be changed to "tarheels#11" on the first change and "tarheels#111" on the second. Another common technique was to substitute a digit to make it "tarheels#2", "tarheels#3", and so on.

“The UNC researchers said if people have to change their passwords every 90 days, they tend to use a pattern and they do what we call a transformation. They take their old passwords, they change it in some small way, and they come up with a new password.”

The researchers used the transformations they uncovered to develop algorithms that could predict changes with great accuracy.
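A minimal sketch of the idea (not the UNC researchers' actual algorithm) shows how cheaply such transformations can be enumerated: given an expired password, generate the likely "next" passwords by toggling a letter's case or bumping a trailing digit.

```python
# Hypothetical transformation-based guesser, illustrating the patterns
# described in the study (case toggles, appended or incremented digits).
def candidate_next_passwords(old):
    guesses = set()
    # toggle the case of each letter in turn: tarheels#1 -> tArheels#1
    for i, ch in enumerate(old):
        if ch.isalpha():
            guesses.add(old[:i] + ch.swapcase() + old[i+1:])
    if old and old[-1].isdigit():
        # duplicate the trailing digit: tarheels#1 -> tarheels#11
        guesses.add(old + old[-1])
        # increment the trailing digit: tarheels#1 -> tarheels#2
        guesses.add(old[:-1] + str((int(old[-1]) + 1) % 10))
    return guesses

print("tArheels#1" in candidate_next_passwords("tarheels#1"))  # True
print("tarheels#2" in candidate_next_passwords("tarheels#1"))  # True
```

An attacker who has cracked one expired password can try a handful of such candidates instead of a full brute-force search, which is why pattern-based changes defeat the purpose of expiry policies.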

A separate study from researchers at Carleton University showed that frequent password changes hamper attackers only minimally and probably not enough to offset the inconvenience to end users.

Courtesy-Fud

August 12, 2016 by  
Filed under Security

You should probably be leery of what you see since, apparently, your computer monitor can be hacked.

Researchers at DEF CON presented a way to manipulate the tiny pixels found on a computer display.

Ang Cui and Jatin Kataria of Red Balloon Security were curious how Dell monitors worked and ended up reverse-engineering one.

They picked apart a Dell U2410 monitor and found that the display controller inside can be used to change and log the pixels across the screen.

During their DEF CON presentation, they showed how the hacked monitor could seemingly alter the details on a web page. In one example, they changed a PayPal account balance from $0 to $1 million, when in reality the pixels on the monitor had simply been reconfigured.

It wasn’t exactly an easy hack to pull off. To discover the vulnerability, Cui and Kataria spent two years of their spare time researching and reverse-engineering the technology inside the Dell monitor.

However, they also looked at monitors from other brands, including Samsung, Acer and Hewlett Packard, and noticed that it was theoretically possible to hack them in the same manner as well.

The key problem lies in the monitors’ firmware, or the software embedded inside. “There’s no security in the way they update their firmware, and it’s very open,” said Cui, who is also CEO of Red Balloon.

The exploit requires gaining access to the monitor itself, through the HDMI or USB port. Once done, the hack could potentially open the door for other malicious attacks, including ransomware.

For instance, cyber criminals could emblazon a permanent message on the display, and ask for payment to remove it, Kataria said. Or they could even spy on users’ monitors, by logging the pixels generated.

However, the two researchers said they made their presentation to raise awareness about computer monitor security. They’ve posted the code to their research online.

“Is monitor security important? I think it is,” Cui said.

Dell couldn’t be reached for immediate comment.

Source- http://www.thegurureview.net/computing-category/computer-monitors-are-also-vulnerable-to-hacking.html

August 5, 2016 by  
Filed under Around The Net

Amazon.com Inc announced that it has entered into a partnership with the British government to hasten the process for allowing small drones to make deliveries.

The world’s biggest online retailer, which has laid out plans to start using drones for deliveries by 2017, said a cross-government team supported by the UK Civil Aviation Authority had provided it with the permissions necessary to explore the process.

Amazon unveiled a video last year showcasing how an unmanned drone could deliver packages, narrated by former Top Gear TV host Jeremy Clarkson.

The U.S. Federal Aviation Administration said last month the use of drones for deliveries will require separate regulation from their general use.

Wal-Mart Stores Inc said last month it was six to nine months from beginning to use drones to check warehouse inventories in the United States.

Source-http://www.thegurureview.net/aroundnet-category/u-k-regulators-give-amazon-permission-to-explore-drone-deliveries.html
