
Is IBM Going To Court Over Unix Dispute?

April 15, 2016
Filed under Computing

Defunct Unix vendor SCO, which claimed that Linux infringed its intellectual property and sought as much as $5bn in compensation from IBM, has filed notice of yet another appeal in the 13-year-old dispute.

The appeal comes after a ruling at the end of February when SCO’s arguments claiming intellectual property ownership over parts of Unix were rejected by a US district court. That judgment noted that SCO had minimal resources to defend counter-claims filed by IBM due to SCO’s bankruptcy.

In a filing, Judge David Nuffer argued that “the nature of the claims are such that no appellate court would have to decide the same issues more than once if there were any subsequent appeals”, effectively suggesting that the case had more than run its course.

On 1 March, that filing was backed up by the judge’s full explanation, declaring IBM the emphatic victor in the long-running saga.

“IT IS ORDERED AND ADJUDGED that pursuant to the orders of the court entered on July 10, 2013, February 5, 2016, and February 8, 2016, judgment is entered in favour of the defendant and plaintiff’s causes of action are dismissed with prejudice,” stated the document.

Now, though, SCO has filed yet again to appeal that judgment, although the precise grounds it is claiming haven’t yet been disclosed.

SCO is being represented by the not-inexpensive law firm of Boies, Schiller & Flexner, which successfully represented the US government against Microsoft in the antitrust case in the late 1990s. Although SCO is officially bankrupt, it’s unclear who continues to bankroll the case. Its one remaining “asset” is its claims for damages against IBM.

Meanwhile, despite the costs of the case, IBM has fought SCO vigorously, refusing even to throw a few million dollars at the company by way of settlement, a move that would only encourage what remains of the company to pursue other, presumably easier, open source targets.

Courtesy-TheInq

 

IBM’s Watson Goes IoT

January 4, 2016
Filed under Computing

IBM has announced a major expansion in Europe with the establishment of a new HQ for Watson Internet of Things (IoT).

The Munich site establishes a global headquarters for the Watson IoT program, which is dedicated to launching “offerings, capabilities and ecosystem partners” designed to bring the cognitive powers of the company’s game-show-winning supercomputer to billions of tiny devices and sensors.

Some 1,000 IBM developers, consultants, researchers and designers will join the Munich facility, which the company describes as an “innovation super center”. It is the biggest IBM investment in Europe for over 20 years.

IBM Cloud will power a series of APIs that will allow IoT developers to harness Watson within their devices.

“The IoT will soon be the largest single source of data on the planet, yet almost 90 percent of that data is never acted on,” said Harriet Green, general manager for Watson IoT and Education.

“With its unique abilities to sense, reason and learn, Watson opens the door for enterprises, governments and individuals to finally harness this real-time data, compare it with historical data sets and deep reservoirs of accumulated knowledge, and then find unexpected correlations that generate new insights to benefit business and society alike.”

The APIs were first revealed in September and new ones for the IoT were announced today.

These include the Natural Language Processing API, which interprets human language in context and can respond in the same plain terms; the Machine Learning Watson API, which can establish patterns in order to perform a repeated task better each time, or change the method to suit; the Video and Image Analytics API, which can infer information from video feeds; and the Text Analytics Watson API, which can glean information from unstructured text data such as Twitter feeds.
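
IBM has not published the exact interfaces here, but services of this kind are typically exposed as REST calls over IBM Cloud. The sketch below shows how a developer might send a snippet of device-related text to a text-analytics endpoint; the URL, payload fields and credentials are hypothetical placeholders, not IBM’s documented API.

```python
# A minimal sketch of calling a Watson-style text analytics service
# over REST. The endpoint, fields and credentials below are
# hypothetical placeholders, not IBM's documented interface.
import requests

WATSON_URL = "https://example.ibmcloud.com/watson/iot/v1/text-analytics"  # hypothetical
API_KEY = "your-api-key"  # hypothetical credential

def analyze_text(text):
    """Send unstructured text (a tweet, a maintenance note) for analysis."""
    response = requests.post(
        WATSON_URL,
        headers={"Authorization": "Bearer " + API_KEY},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. entities, sentiment, keywords

if __name__ == "__main__":
    print(analyze_text("Compressor temperature trending 12C above normal"))
```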

The company will also open eight regional centres across four continents to give customers in those territories the opportunity to access information and experiences.

Courtesy-TheInq

 

Apple Buys Parts of Qualcomm

December 31, 2015
Filed under Around The Net

Apple has bought one of Qualcomm’s Taiwan graphics labs and is operating it pretty much under everyone’s radar to “invent” something that Qualcomm tried and failed to make successful.

The lab was used by Qualcomm to develop its Interferometric Modulator Display (IMOD) technology, and Apple Insider claims it is now being used to develop thinner, lighter, brighter and more energy-efficient screens.

The lab employs at least 50 engineers and has recruited talent from display maker AU Optronics and Qualcomm. Outside the lab there is no signage or much to indicate that the Fruity Cargo Cult has assumed control.

Government records show that the building is registered to Apple Taiwan, and staff in the building were observed wearing Apple ID badges.

Bloomberg thinks Apple wants to “reduce reliance on the technology developed by suppliers such as Samsung, LG, Sharp and Japan Display” and instead “develop the production processes in-house and outsource to smaller manufacturers such as Taiwan’s AU Optronics or Innolux”.

Apple currently uses LCD screens in its Macs and iOS devices and an OLED display for the Apple Watch, and the new lab was where Qualcomm tried to develop its own Mirasol displays.

Mirasol uses a different technology from backlit LCD or OLED: an array of microscopic mirror-like elements that each reflect light of a specific colour. It needs no backlight and only draws energy when an element is switched, much like E-Ink.

The downside to IMOD has historically been that it reproduces flat, unsaturated colours, a problem that may be possible to fix. Qualcomm introduced a Toq smartwatch with an IMOD screen, but the device flopped.

Qualcomm took a $142 million charge on its Mirasol display business and a year ago there were rumours Qualcomm was selling off its Longtan Mirasol panel plant to TSMC.

What appears to have happened is that Jobs Mob bought more than just the facility: it has some interest in the Mirasol IMOD technology itself, which could enable a new class of low-power displays for use in phones, tablets or wearables.

Courtesy-Fud

Suse Goes 64-bit ARM Servers

July 28, 2015
Filed under Computing

Suse wants to speed the development of server systems based on 64-bit ARM processors.

The outfit said that it is making available to its partners a version of Suse Linux Enterprise 12 ported to ARM’s 64-bit architecture (AArch64).

This will enable them to develop, test and deliver products to the market based on ARM chips.

Suse has also implemented support for AArch64 into its openSUSE Build Service. This allows the community to build packages against real 64-bit ARM hardware and the Suse Linux Enterprise 12 binaries.

Hopefully this will improve the time to market for ARM-based solutions, the firm said.

Suse partners include chip makers AMD, AppliedMicro and Cavium, as well as systems vendors Dell, HP and SoftIron. Suse wants ARM processors to be part of a scalable technology platform in the data centre.

Through participation in the programme, partners will be able to build solutions for various applications, from purpose-built appliances for security, medical and network functions, to hyperscale computing, distributed storage and software-defined networking.

There are multiple vendors using the same core technology licensed from ARM. This provides a common base for the OS vendors, like Suse, to build support in their kernel.

Suse has some competition for ARM-based systems. Last year, Red Hat started up its ARM Partner Early Access Programme (PEAP), while Canonical has offered ARM support in its Ubuntu platform for several years now, including a long-term support (LTS) release last year that included the OpenStack cloud computing framework.

Source

IBM Buys Blue Box

June 15, 2015
Filed under Computing

IBM has acquired Blue Box in an attempt to make its cloud offering even bluer. The Seattle-based company specialises in simple platform-as-a-service clouds based on OpenStack.

This, of course, fits in with IBM’s new direction of a Power and OpenStack cloud-based world, as demonstrated by its collaboration with MariaDB on TurboLAMP.

IBM’s move to the cloud is starting to pay off, with cloud revenue of $7.7bn in the 12 months to March 2015, growing more than 16 percent in the first quarter of this year.

The company plans to use the new acquisition to help customers rapidly integrate cloud-based applications and on-premise systems within an OpenStack managed cloud.

Blue Box also brings a remotely managed OpenStack platform that provides customers with a local cloud, better visibility and control, and tighter security.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meets their current and future business needs,” said IBM general manager of cloud services Jim Comfort.

“The acquisition of Blue Box accelerates IBM’s open cloud strategy, making it easier for our clients to move data and applications across clouds and adopt hybrid cloud environments.”

Blue Box will offer customers a more cohesive, consistent and simplified experience, while at the same time integrating with existing IBM packages like the Bluemix digital innovation platform. The firm also offers a single unified control panel for customer operations.

“No brand is more respected in IT than IBM. Blue Box is building a similarly respected brand in OpenStack,” said Blue Box founder and CTO Jesse Proudman.

“Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds.

“This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they’ve seen thus far in the market.”

IBM has confirmed that it plans to help Blue Box customers to grow their technology portfolio, while taking advantage of the broader IBM product set.

Source

SUSE Brings Hadoop To IBM z Mainframes

April 1, 2015
Filed under Computing

SUSE and Apache Hadoop vendor Veristorm are teaming up to bring Hadoop to IBM z and IBM Power systems.

The result will mean that regardless of system architecture, users will be able to run Apache Hadoop within a Linux container on their existing hardware, meaning that more users than ever will be able to process big data into meaningful information to inform their business decisions.

Veristorm’s Data Hub and vStorm Enterprise Hadoop will now be available as zDoop, the first mainframe-compatible Hadoop distribution, running on SUSE Linux Enterprise Server, either on System z mainframes or on IBM Power8 machines in little-endian mode, which makes it significantly easier for x86-based software to be ported to the IBM platform.
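
Whatever the underlying architecture, a Hadoop job itself looks the same to the developer. As a rough illustration, here is a minimal word-count mapper and reducer of the kind Hadoop Streaming can run; the file name and the invocation in the comment are illustrative assumptions, not anything specific to zDoop.

```python
#!/usr/bin/env python3
# wordcount.py -- a minimal Hadoop Streaming word count. Illustrative
# invocation (paths and job layout are assumptions, not zDoop-specific):
#   hadoop jar hadoop-streaming.jar -files wordcount.py \
#       -input /logs -output /counts \
#       -mapper "wordcount.py map" -reducer "wordcount.py reduce"
import sys

def mapper():
    # Emit "word<TAB>1" for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Streaming sorts by key, so counts for a word arrive contiguously.
    current, count = None, 0
    for line in sys.stdin:
        word, _, n = line.rstrip("\n").partition("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```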

SUSE and Veristorm have also committed to work together on educating partners and channels on the benefits of the overall package.

Naji Almahmoud, head of global business development for SUSE, said: “The growing need for big data processing to make informed business decisions is becoming increasingly unavoidable.

“However, existing solutions often struggle to handle the processing load, which in turn leads to more servers and difficult-to-manage sprawl. This partnership with Veristorm allows enterprises to efficiently analyse their mainframe data using Hadoop.”

Veristorm launched Hadoop for Linux in April of last year, explaining that it “will help clients to avoid staging and offloading of mainframe data to maintain existing security and governance controls”.

Sanjay Mazumder, CEO of Veristorm, said that the partnership will help customers “maximize their processing ability and leverage their richest data sources” and deploy “successful, pragmatic projects”.

SUSE has been particularly active of late, announcing last month that its software-defined Enterprise Storage product, built around the open source Ceph framework, was to become available as a standalone product for the first time.

Source

Is Oracle’s Linux 7 Unbreakable?

August 5, 2014
Filed under Computing

Oracle has announced the release of its Linux distribution Oracle Linux 7.

Oracle Linux 7 is the latest release of the company’s enterprise-grade Linux flavour, a fork of Red Hat Enterprise Linux.

This latest release adds a range of features including XFS, Btrfs, Linux Containers (LXC), DTrace, Ksplice, Xen enhancements and Oracle’s Unbreakable Enterprise Kernel Release 3.

“Oracle Linux continues to provide the most flexible options for customers and partners, allowing them to easily innovate, collaborate, and create enterprise-grade solutions,” said Oracle SVP of Linux and Virtualization Engineering Wim Coekaerts.

“With Oracle Linux 7, users have more freedom to choose the technologies and solutions that best meet their business objectives. Oracle Linux allows users to benefit from an open approach for emerging technologies, like Openstack, and allows them to meet the performance and reliability requirements of the modern data center.”

Oracle’s outspoken CEO Larry Ellison recently claimed that its servers were “untouchable”, two weeks after it released patches for 36 vulnerabilities in its Java platform.

The company recently won an appeals court ruling against Google after successfully arguing that the APIs used in Google’s Android mobile operating system infringed Oracle copyrights.

The Oracle Linux 7 operating system is freely downloadable, with updates and security fixes subsequently available from Oracle’s public yum servers. A paid option is also available for anyone wishing to buy Oracle support.

Oracle Linux 7 has a 10-year production lifecycle, or lifetime support for subscribers, with additional upgrade support available for users of the Unbreakable Enterprise Kernel.

Source

IBM Goes Linux

September 27, 2013
Filed under Computing

IBM reportedly will invest $1bn in Linux and other open source technologies for its Power system servers.

The firm is expected to announce the news at the Linuxcon 2013 conference in New Orleans, pledging to spend $1bn over five years on Linux and related open source technologies.

The software technology will be used on IBM’s Power line of servers, which are based on the chip technology of the same name and used for running large scale systems in data centres.

Previously IBM Power systems have mostly run IBM’s proprietary AIX version of Unix, though some used in high performance computing (HPC) configurations have run Linux.

If true, this will be the second time IBM has coughed up a $1bn investment in Linux. IBM gave the open source OS the same vote of confidence around 13 years ago.

According to the Wall Street Journal, IBM isn’t investing in Linux to convert its existing AIX customers, but instead Linux will help support data centre applications driving big data, cloud computing and analytics.

“We continue to take share in Unix, but it’s just not growing as fast as Linux,” said IBM VP of Power development Brad McCredie.

The $1bn is expected to go mainly towards facilities and staffing to help Power system users move to Linux, with a new centre being opened in France especially to help manage that transition.

Full details are planned to be announced at Linuxcon later today.

Last month, IBM swallowed Israeli security firm Trusteer to boost its customers’ cyber defences with the company’s anti-hacking technology.

Announcing that it had signed a definitive agreement with Trusteer to create a security lab in Israel, IBM said it planned to focus on mobile and application security, counter-fraud and malware detection staffed by 200 Trusteer and IBM researchers.

Source

Dell Promises ExaScale By 2015

June 17, 2013
Filed under Computing

Dell has claimed it will make exascale computing available by 2015, as the firm enters the high performance computing (HPC) market.

Speaking at the firm’s Enterprise Forum in San Jose, Sam Greenblatt, chief architect of Dell’s Enterprise Solutions Group, said the firm will have exascale systems by 2015, ahead of rival vendors. However, he added that development will not be boosted by a doubling in processor performance, saying Moore’s Law is no longer valid and is actually presenting a barrier for vendors.

“It’s not doubling every two years any more, it has flattened out significantly,” he said. According to Greenblatt, the only way firms can achieve exascale computing is through clustering. “We have to design servers that can actually get us to exascale. The only way you can do it is to use a form of clustering, which is getting multiple parallel processes going,” he said.
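
To make the clustering point concrete, here is a toy sketch of work spread across many cooperating processes with MPI; mpi4py and the sum-of-integers workload are our own illustrative choices, not anything Dell described.

```python
# A toy sketch of the clustering approach: many cooperating processes,
# each doing a slice of the work, rather than one ever-faster core.
# Uses mpi4py; run with e.g.  mpiexec -n 4 python sum_cluster.py
# (the workload is an illustrative stand-in, not Dell's software).
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's id, 0..size-1
size = comm.Get_size()   # total number of processes in the cluster

N = 100_000_000
# Cyclic distribution: rank r sums r, r+size, r+2*size, ... so the
# ranks between them cover 0..N-1 exactly once.
local = sum(range(rank, N, size))

# Combine the partial sums on rank 0.
total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("total =", total)
```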

Not only did Greenblatt warn that hardware will have to be packaged differently to reach exascale performance, he said that programmers will also need to change. “This is going to be an area that’s really great, but the problem is you never programmed for this area, you programmed to that old Von Neumann machine.”

According to Greenblatt, the shifting of data will also be cut down, a move that he said will lead to network latency being less of a performance issue. “Things are going to change very dramatically, your data is going to get bigger, processing power is going to get bigger and network latency is going to start to diminish, because we can’t move all this [data] through the pipe,” he said.

Greenblatt’s reference to data being closer to the processor is a nod to the increasing volume of data that is being handled. While HPC networking firms such as Mellanox and Emulex are increasing bandwidths on their respective switch gear, bandwidth increases are being outpaced by the growth in the size of datasets used by firms deploying analytics workloads or academic research.

That Dell is projecting 2015 for the arrival of exascale clusters is at least a few years sooner than firms such as Intel, Cray and HP, all of which have put a “by 2020” timeframe on the challenge. However, what Greenblatt did not mention is the projected power efficiency of Dell’s 2015 exascale cluster, something that will be critical to its usability.

Source

TI Chip Goes 1080p On Android Devices

July 10, 2011
Filed under Computing

Texas Instruments on Tuesday said its OMAP chip had been certified to unlock full 1080p movies from Netflix for Google’s Android 2.3 based devices, which includes smartphones and tablets.

TI’s on-chip security feature, called M-Shield, will be able to decode 1080p high-definition movie streams from Netflix, said Fred Cohen, director of the OMAP user experience team at TI. A security layer unlocks the encoded video, which can then be viewed on smartphones and tablets, or on TV sets connected through an HDMI (high-definition multimedia interface) port.

The purpose of having this technology is to provide end-to-end security for protected video content, Cohen said. Movie studios are making more high-definition 1080p content available and are adamant about protecting their product, which is considered premium content.

The on-chip feature minimizes the ability to copy content, as it is easy to take control of a rooted Android device, Cohen said. It’s easy for users to access memory where the stream is temporarily stored, and then write the movie to another device.

“You have to protect those devices,” Cohen said. “We have implemented a firewall.”

TI’s security technology provides a security layer so that devices can get access to high-definition movies, Cohen said.

Netflix provides different levels of security certification depending on features such as the video quality and resolution, Cohen said. Netflix did not return a request for comment on whether it was streaming 1080p video content to mobile devices, or whether chip makers required certification to unlock secure 1080p content.

Read More…
