SkySQL Joins IBM On SQL Merger

April 18, 2014 by admin  
Filed under Computing

SkySQL has announced a line of MariaDB products that combine NoSQL and SQL technology, offering users the ability to handle large unstructured data sets alongside traditional database features to ensure data consistency.

Available immediately, MariaDB Enterprise 2 and MariaDB Enterprise Cluster 2 are based on the code used in the firm’s MariaDB 10 database server, which it also released today.

According to SkySQL, the availability of an enterprise-grade SQL database system with NoSQL interoperability will be a game-changer for developers building revenue-generating applications and for database administrators in charge of large, complex environments.

The two new products have been developed with support from other partners in the open source community, including Red Hat, IBM and Google, according to the firm, and are aimed at giving IT managers more options for managing large volumes of data.

In fact, Red Hat will use MariaDB Enterprise 2 as the default database for its enterprise customers, while Google has also moved large parts of its infrastructure to MariaDB, according to Dion Cornett, VP of Global Sales for SkySQL.

Cornett said that customers have been using a wide variety of databases over the past few years in order to meet the diverse requirements of applications.

“The types of applications have evolved over time, and the challenge we now have today is that people have different IT stack structures, and trying to integrate all that has been very challenging and required lots of custom code to be created. What we’re doing with MariaDB is introducing an array of features to combine the best of both worlds,” he said.

The features are designed to allow developers and database administrators to take many different data structures, integrate them, and use them in a cohesive application, in the same way that standard database tools presently allow.

These include the CONNECT storage engine, which enables access to a wide variety of file formats such as XML and CSV files, and the ability to run familiar SQL commands against that data.

A key feature is dynamic columns, which enables MariaDB to “smartly interpret” incoming data and adapt it to the data structure that best fits, according to Cornett.

“At a technical level what you’re actually looking at are files within the cells of information that can vary in size, which is not a capability you’ve traditionally had in databases and that flexibility is a big leap forward,” he said.

The new MariaDB products also include a storage engine for Apache Cassandra, which lets a columnar data store be read from and written to as if it were a traditional SQL table.
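
To make the combination more concrete, the sketch below shows how the CONNECT engine and dynamic columns described above might be used from application code. It is a minimal illustration in C++ against the MariaDB/MySQL C client API, assuming a local MariaDB 10 server; the credentials, table names and CSV path are placeholders for illustration, not anything SkySQL ships.

    // Minimal sketch: CONNECT engine plus dynamic columns, via the C client API.
    // All names, credentials and file paths below are illustrative assumptions.
    #include <cstdio>
    #include <mysql/mysql.h>

    static void run(MYSQL *db, const char *sql)
    {
        if (mysql_query(db, sql) != 0)
            std::fprintf(stderr, "error: %s\n", mysql_error(db));
    }

    int main()
    {
        MYSQL *db = mysql_init(nullptr);
        if (!mysql_real_connect(db, "localhost", "user", "password", "test", 0, nullptr, 0))
            return 1;

        // CONNECT storage engine: expose an external CSV file as an ordinary table
        // that familiar SQL commands can query.
        run(db, "CREATE TABLE readings (ts DATETIME, reading DOUBLE) "
                "ENGINE=CONNECT table_type=CSV file_name='readings.csv'");

        // Dynamic columns: keep loosely structured attributes in a BLOB next to
        // fixed, fully structured columns, and query them with SQL functions.
        run(db, "CREATE TABLE assets (id INT PRIMARY KEY, attrs BLOB)");
        run(db, "INSERT INTO assets VALUES (1, COLUMN_CREATE('colour', 'blue', 'size', 'XL'))");
        run(db, "SELECT COLUMN_GET(attrs, 'colour' AS CHAR) FROM assets WHERE id = 1");

        if (MYSQL_RES *res = mysql_store_result(db)) {
            if (MYSQL_ROW row = mysql_fetch_row(res))
                std::printf("colour = %s\n", row[0]);
            mysql_free_result(res);
        }

        mysql_close(db);
        return 0;
    }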

An example of how MariaDB Enterprise 2 might be used is if a service provider has a large-scale video server and wants to combine that with billing information, Cornett said.

“The customer’s video history and what they’re consuming could be very unstructured, but the billing structure will be very fixed, and it has been something of a challenge to bring the two of those together up to this point,” he explained.

Source

Can DirectX 12 Give Mobile A Boost?

April 16, 2014 by admin  
Filed under Computing

Microsoft announced DirectX 12 just a few days ago, and for the first time Redmond’s API is relevant beyond the PC space. Some DirectX 12 tech will end up in phones and, of course, Windows tablets.

Qualcomm likes the idea, along with Nvidia. Qualcomm published a blog post on the potential impact of DirectX 12 on the mobile industry, and the takeaway is very positive indeed.

DirectX 12 equals less overhead, more battery life

Qualcomm says it has worked closely with Microsoft to optimise “Windows mobile operating systems” and make the most of Adreno graphics. The chipmaker points out that current Snapdragon chipsets already support DirectX 9.3 and DirectX 11.  However, the transition to DirectX 12 will make a huge difference.

“DirectX 12 will turbocharge gaming on Snapdragon-enabled devices in many ways. Just a few years ago, our Snapdragon processors featured one CPU core; now most Snapdragon processors offer four. The new libraries and APIs in DirectX 12 make more efficient use of these multiple cores to deliver better performance,” Qualcomm said.

DirectX 12 will also allow the GPU to be used more efficiently, delivering superior performance per watt.

“That means games will look better and deliver longer gameplay on a single charge,” Qualcomm’s gaming and graphics director Jim Merrick added.

What about eye candy?

Any improvement in efficiency also tends to have a positive effect on overall quality. Developers can get more out of existing hardware; they will have more resources at their disposal, simple as that.

Qualcomm also points out that DirectX 12 is the first version to launch on Microsoft’s mobile operating systems at the same time as its desktop and console counterparts.

The company believes this underlines the growing shift towards, and consumer demand for, mobile gaming. The simultaneous launch will also make it easier to port desktop and console games to mobile platforms.

Of course, this does not mean that we’ll be able to play Titanfall on a Nokia Lumia, or that similarly demanding titles can be ported. However, it will speed up development and allow developers and publishers to recycle resources used in console and PC games. Since Windows Phone isn’t exactly the biggest mobile platform out there, this might be very helpful and it might attract more developers.

Source

AMD, Intel & nVidia Go OpenGL

April 7, 2014 by admin  
Filed under Computing

AMD, Intel and Nvidia teamed up to tout the advantages of the OpenGL multi-platform application programming interface (API) at this year’s Game Developers Conference (GDC).

Sharing a stage at the event in San Francisco, the three major chip designers explained how, with a little tuning, OpenGL can offer developers between 7 and 15 times better performance, compared with the more widely recognised increases of 1.3 times.

AMD manager of software development Graham Sellers, Intel graphics software engineer Tim Foley, Nvidia OpenGL engineer Cass Everitt and Nvidia senior software engineer John McDonald presented their OpenGL techniques on real-world devices to demonstrate that they are suitable for use across multiple platforms.

During the presentation, Intel’s Foley talked up three techniques that can help OpenGL increase performance and reduce driver overhead: persistent-mapped buffers for faster streaming of dynamic geometry, MultiDrawIndirect (MDI) for faster submission of many draw calls, and packing 2D textures into arrays so texture changes no longer break batches.
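
As a rough illustration of the first of those techniques, the C++ sketch below creates a persistently mapped vertex buffer with OpenGL 4.4’s glBufferStorage and writes dynamic geometry into it each frame without remapping. It assumes an OpenGL 4.4 context and a function loader are already set up; the region size and triple-buffering scheme are illustrative, and the fence synchronisation that guards region reuse is only noted in a comment.

    // Persistent-mapped buffer sketch (assumes an OpenGL 4.4 context and a loader
    // such as glad already initialised; sizes and layout are illustrative).
    #include <glad/glad.h>
    #include <cstring>

    static GLuint           vbo     = 0;
    static void            *mapped  = nullptr;
    static const GLsizeiptr REGION  = 1 << 20;   // 1 MB of vertex data per in-flight frame
    static const int        REGIONS = 3;         // triple-buffer to avoid stalling the GPU

    void createPersistentBuffer()
    {
        const GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;

        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        // Immutable storage that may stay mapped while the GPU reads from it.
        glBufferStorage(GL_ARRAY_BUFFER, REGION * REGIONS, nullptr, flags);
        // Map once at start-up; no per-frame glMapBuffer/glUnmapBuffer calls.
        mapped = glMapBufferRange(GL_ARRAY_BUFFER, 0, REGION * REGIONS, flags);
    }

    void streamFrame(int frameIndex, const void *verts, std::size_t bytes)
    {
        // Write this frame's dynamic geometry into a region the GPU is not
        // currently reading (fence syncs, omitted here, guard region reuse).
        char *dst = static_cast<char *>(mapped) + (frameIndex % REGIONS) * REGION;
        std::memcpy(dst, verts, bytes);
        // ... point vertex attributes at this region and issue the draw call ...
    }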

They also said during their presentation that, with proper implementation of these OpenGL techniques, driver overhead could be reduced to almost zero, something Nvidia’s software engineers have claimed is impossible with Direct3D and possible only with OpenGL.

Nvidia’s VP of game content and technology, Ashu Rege, posted his account of the joint GDC session on the Nvidia blog.

“The techniques presented apply to all major vendors and are suitable for use across multiple platforms,” Rege wrote.

“OpenGL can cut through the driver overhead that has been a frustrating reality for game developers since the beginning of the PC game industry. On desktop systems, driver overhead can decrease frame rate. On mobile devices, however, driver overhead is even more insidious, robbing both battery life and frame rate.”

The talk was entitled Approaching Zero Driver Overhead.

Also at GDC, Microsoft unveiled the latest version of its graphics API, DirectX 12, with Direct3D 12 for more efficient gaming.

Showing off the new DirectX 12 API during a demo of the Xbox One racing game Forza 5 running on a PC with an Nvidia GeForce Titan Black graphics card, Microsoft said DirectX 12 gives applications the ability to directly manage resources and perform synchronisation. As a result, developers of advanced applications can control the GPU to make games run more efficiently.

Source

Do Chip Makers Have Cold Feet?

March 27, 2014 by admin  
Filed under Computing

It is starting to look like chip makers are getting cold feet about moving to the next technology for chipmaking. Fabricating chips on larger silicon wafers is the next cycle in that transition, but according to the Wall Street Journal chipmakers are mothballing their plans.

Companies have to make massive upfront outlays for plants and equipment, and they are refusing because the latest change could boost the cost of a single high-volume factory to as much as $10 billion, from around $4 billion. Some companies have been reining in their investments, raising fears that the equipment needed to produce the new chips might be delayed for a year or more.

ASML, a maker of key machines used to define features on chips, recently said it had “paused” development of gear designed to work with the larger wafers. Intel said it has slowed some payments to the Netherlands-based company under a deal to help develop the technology.

Gary Dickerson, chief executive of Applied Materials, said that the move to larger wafers “has definitely been pushed out from a timing standpoint”.

Source

nVidia Outs CUDA 6

March 19, 2014 by admin  
Filed under Computing

Nvidia has made the CUDA 6 Release Candidate, the latest version of its GPU programming platform, available for developers to download for free.

The release arrives with several new features and improvements to make parallel programming “better, faster and easier” for developers creating next generation scientific, engineering, enterprise and other applications.

Nvidia has aggressively promoted CUDA as a way for developers to exploit the floating point performance of its GPUs. Available now, the CUDA 6 Release Candidate brings a major new feature, unified memory, which lets CUDA applications access CPU and GPU memory without the need to manually copy data from one to the other.

“This is a major time saver that simplifies the programming process, and makes it easier for programmers to add GPU acceleration in a wider range of applications,” Nvidia said in a blog post on Thursday.
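
The sketch below shows roughly what that looks like in practice: a single managed allocation is filled by ordinary host code, handed straight to a GPU routine (here a cuBLAS saxpy, used purely as an example of GPU-side work), and read back on the CPU with no explicit copies. It assumes CUDA 6 with cuBLAS installed; the array size and values are illustrative.

    // Unified memory sketch (assumes CUDA 6 and cuBLAS; sizes and values illustrative).
    #include <cstdio>
    #include <cuda_runtime.h>
    #include <cublas_v2.h>

    int main()
    {
        const int n = 1 << 20;
        float *x = nullptr, *y = nullptr;

        // One allocation visible to both CPU and GPU; no cudaMemcpy calls needed.
        cudaMallocManaged((void **)&x, n * sizeof(float), cudaMemAttachGlobal);
        cudaMallocManaged((void **)&y, n * sizeof(float), cudaMemAttachGlobal);

        // Fill the arrays from ordinary host code.
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Compute y = 3*x + y on the GPU, passing the very same pointers.
        cublasHandle_t handle;
        cublasCreate(&handle);
        const float alpha = 3.0f;
        cublasSaxpy(handle, n, &alpha, x, 1, y, 1);

        // Synchronise before the CPU touches managed memory again.
        cudaDeviceSynchronize();
        std::printf("y[0] = %f\n", y[0]);   // expected: 5.0

        cublasDestroy(handle);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }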

There’s also the addition of “drop-in libraries”, which Nvidia said will accelerate applications by up to eight times.

“The new drop-in libraries can automatically accelerate your BLAS and FFTW calculations by simply replacing the existing CPU-only BLAS or FFTW library with the new, GPU-accelerated equivalent,” the chip designer added.
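
The point of the drop-in approach is that application code keeps calling the standard interface. The short C++ sketch below calls an ordinary CBLAS matrix multiply; per Nvidia’s description above, pointing the link step (or a library preload) at the GPU-accelerated drop-in BLAS would accelerate this same call with no source changes. The CBLAS interface and the problem size here are assumptions for illustration, not Nvidia-specific code.

    // Plain CBLAS call; the "drop-in" idea is that swapping the linked BLAS
    // library accelerates it without touching this code. Sizes are illustrative.
    #include <cstdio>
    #include <vector>
    #include <cblas.h>

    int main()
    {
        const int n = 512;
        std::vector<double> A(n * n, 1.0), B(n * n, 2.0), C(n * n, 0.0);

        // C = 1.0 * A * B + 0.0 * C, row-major, no transposes.
        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    n, n, n, 1.0, A.data(), n, B.data(), n, 0.0, C.data(), n);

        std::printf("C[0] = %f\n", C[0]);   // expected: 2.0 * n
        return 0;
    }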

Multi-GPU scaling has also been added in CUDA 6, introducing redesigned BLAS and FFT GPU libraries that automatically scale performance across up to eight GPUs in a single node. Nvidia said this provides over nine teraflops of double-precision performance per node and supports workloads of up to 512GB in size, larger than previously supported.

“In addition to the new features, the CUDA 6 platform offers a full suite of programming tools, GPU-accelerated math libraries, documentation and programming guides,” Nvidia said.

The previous CUDA 5.5 Release Candidate was issued last June, adding support for ARM-based processors.

Aside from ARM support, Nvidia also improved Hyper-Q support in CUDA 5.5, which allowed developers to use MPI workload prioritisation. The firm also touted improved performance analysis tools and improved cross-compilation performance on x86 processors.

Source

Is AMD Worried?

March 17, 2014 by admin  
Filed under Computing

AMD’s Mantle has been a hot topic for quite some time and, despite its delayed birth, it has finally delivered performance gains in Battlefield 4. Microsoft is not sleeping either: it has its own answer to Mantle, which we mentioned here.

Oddly enough, we heard some industry people calling it DirectX 12 or DirectX Next, but it looks like Microsoft is getting ready to finally unveil the next generation of DirectX. From what we heard, the next-generation DirectX will fix some of the driver overhead problems that were addressed by Mantle, which is a good thing for the whole industry and, of course, gamers.

AMD got back to us, officially stating that “AMD would like you to know that it supports and celebrates a direction for game development that is aligned with AMD’s vision of lower-level, ‘closer to the metal’ graphics APIs for PC gaming. While industry experts expect this to take some time, developers can immediately leverage efficient API design using Mantle.”

AMD also told us that we can expect some information about this at the Game Developers Conference, which starts on March 17th, less than two weeks from now.

We have a feeling that Microsoft is finally ready to talk about DirectX Next, DirectX 11.X, DirectX 12 or whatever it ends up being called, and we would not be surprised to see Nvidia’s 20nm Maxwell chips support this API, as well as future GPUs from AMD, possibly again 20nm parts.

Source

Is Samsung Ditching Android?

March 13, 2014 by admin  
Filed under Around The Net

Samsung appears to have delivered a huge snub to Android OS maker Google. Samsung’s new smartwatches, the Gear 2 and Gear 2 Neo, the sequels to the poorly reviewed original Galaxy Gear, are going to ship without Android.

Instead, the new Gears run Tizen, another open source operating system that Samsung, Intel, and others are working on. It is starting to look like Samsung wants to distance itself from its reliance on Google for software and services.

Samsung’s official reason is that Tizen has better battery life and performance. The new Gears can get up to an extra two days of battery life by running Tizen, even though they have the same size battery. The Galaxy Gear barely made it through a day on one charge.

To be fair, Android isn’t optimized to run on wearable devices like smartwatches, but Samsung didn’t want to wait around for Google to catch up. It was clearly concerned about beating Apple to market. So far Apple has not shown up.

Source

Is Ethernet For Autos?

March 11, 2014 by admin  
Filed under Around The Net

The most ubiquitous local area networking technology used by large companies may be packing its bags for a road trip.

As in-vehicle electronics become more sophisticated to support autonomous driving, cameras, and infotainment systems, Ethernet has become a top contender for connecting them.

For example, the BMW X5 automobile, released last year, used single-pair twisted-wire 100Mbps Ethernet to connect its driver-assistance cameras.

Paris-based Parrot, which supplies mobile accessories to automakers BMW, Hyundai and others, has developed in-car Ethernet. Its first Ethernet-connected systems could hit the market as soon as 2015, says Eric Riyahi, executive vice president of global operations.

Parrot’s new Ethernet-based Audio Video Bridging (AVB) technology uses Broadcom’s BroadR-Reach automotive Ethernet controller chips.

The AVB technology’s network management capabilities allow automakers to control the timing of data streams between specific network nodes in a vehicle and to control bandwidth in order to manage competing data traffic.

Ethernet’s greater bandwidth could provide drivers with turn-by-turn navigation while a front-seat passenger streams music from the Internet, and each back-seat passenger watches streaming videos on separate displays.

“In-car Ethernet is seen as a very promising way to provide the needed bandwidth for coming new applications within the fields of connectivity, infotainment and safety,” said Hans Alminger, senior manager for Diagnostics & ECU Platform at Volvo, in a statement.

Ethernet was initially used by automakers only for on-board diagnostics. But as automotive electronics advanced, the technology has found a place in advanced driver assistance systems and infotainment platforms.

Many manufacturers also use Ethernet to connect rear vision cameras to a car’s infotainment or safety system, said Patrick Popp, chief technology officer of Automotive at TE Connectivity, a maker of car antennas and other automobile communications parts.

Currently, however, there are as many as nine proprietary auto networking specifications, including LIN, CAN/CAN-FD, MOST and FlexRay. FlexRay, for example, has a 10Mbps transmission rate. Ethernet could increase that tenfold or more.

The effort to create a single vehicle Ethernet standard is being led by the Open Alliance and the IEEE 802.3 working group. The groups are working to establish 100Mbps and 1Gbps Ethernet as de facto standards.

The first automotive Ethernet standard draft is expected this year.

The Open Alliance claims more than 200 members, including General Motors, Ford, Daimler, Honda, Hyundai, BMW, Toyota, Volkswagen, Jaguar Land Rover, Renault, Volvo, Bosch, Freescale and Harman.

Broadcom, which makes electronic control unit chips for automobiles, is a member of the Open Alliance and is working on the effort to standardize automotive Ethernet.

Source

What Do Smaller Controllers Mean?

March 10, 2014 by admin  
Filed under Computing

If you want a wearable Internet of Things, the electronics have to be as tiny and as energy efficient as possible. That’s why a new microcontroller by Freescale Semiconductor is noteworthy.

The company has produced the Kinetis KL03 MCU, a 32-bit ARM system that is 15% smaller than its previous iteration but with a 10% power improvement.

Internet of Things is a buzzword for the trend toward network-connected sensors incorporated into devices that in the past were standalone appliances. These devices capture data such as temperature in thermostats and pressure, using accelerometers, gyroscopes and other types of MEMS sensors. A microcontroller unit gives intelligence and limited computational capability to these devices, but it is not a general-purpose processor. One of the roles of the microcontroller is to pass the data on to more sophisticated computational power.

The Kinetis KL03 runs a lightweight embedded operating system to connect the data to other devices, such as an app running on a more general-purpose processor.

Kathleen Jachimiak, product launch manager at Freescale, said the new microcontroller will “enable further miniaturization” in connected devices. The MCU offers up to 32KB of flash memory and 2KB of RAM.

Consumers want devices that are light, small and smart. They also want to be able to store their information and send it to an application that’s either on a phone or a PC, Jachimiak said.

This microcontroller, at 1.6 x 2.0 mm, is smaller than the dimple on a golf ball, and it is manufactured using a relatively new process called wafer-level chip-scale packaging. The process involves building the integrated package while the die is still part of a wafer. It is a more efficient process and produces the smallest possible package for a given die size.

Source

Ubuntu Cross-Platform Delayed

February 26, 2014 by admin  
Filed under Computing

Ubuntu will not offer cross-platform apps as soon as it had hoped.

Canonical had raised hopes that its plan for Ubuntu to span PCs and mobile devices would be realised with the upcoming Ubuntu 14.04 release, providing a write-once, run-on-many template similar to that planned by Google for its Chrome OS and Android app convergence.

This is already possible on paper and the infrastructure is in place on smartphone and tablet versions of Ubuntu through its new Unity 8 user interface.

However, Canonical has decided to postpone the rollout of Unity 8 for desktop machines, citing security concerns, and it will now not appear until the Mir display server arrives this coming autumn.

This will apply only to apps in the Ubuntu store, and in the true spirit of open source, anyone choosing to step outside that ecosystem will be able to test the converged Ubuntu before then.

Ubuntu community manager Jono Bacon told Ars Technica, “We don’t plan on shipping apps in the new converged store on the desktop until Unity 8 and Mir lands.

“The reason is that we use app insulation to (a) run apps securely and (b) not require manual reviews (so we can speed up the time to get apps in the store). With our plan to move to Mir, our app insulation doesn’t currently insulate against X apps sniffing events in other X apps. As such, while Ubuntu SDK apps in click packages will run on today’s Unity 7 desktop, we don’t want to make them readily available to users until we ship Mir and have this final security consideration in place.

“Now, if a core-dev or motu wants to manually review an Ubuntu SDK app and ship it in the normal main/universe archives, the security concern is then taken care of with a manual review, but we are not recommending this workflow due to the strain of manual reviews.”

As well as the aforementioned security issues, there are still concerns that cross-platform apps don’t look quite as good on the desktop as native desktop versions, and the intervening six months will be used to polish the user experience.

Getting the holistic experience right is essential for Ubuntu in order to attract OEMs to the converged operating system. Canonical’s attempt to crowdfund its own Ubuntu handset fell short of its ambitious $20m target, despite raising $10.2 million, the single largest crowdfunding total to date.

Source