Syber Group

Is IBM Going To Court Over Unix Dispute?

April 15, 2016 by  
Filed under Computing


Defunct Unix vendor SCO, which claimed that Linux infringed its intellectual property and sought as much as $5bn in compensation from IBM, has filed notice of yet another appeal in the 13-year-old dispute.

The appeal comes after a ruling at the end of February when SCO’s arguments claiming intellectual property ownership over parts of Unix were rejected by a US district court. That judgment noted that SCO had minimal resources to defend counter-claims filed by IBM due to SCO’s bankruptcy.

In a filing, Judge David Nuffer argued that “the nature of the claims are such that no appellate court would have to decide the same issues more than once if there were any subsequent appeals”, effectively suggesting that the case had more than run its course.

On 1 March, that filing was backed up by the judge’s full explanation, declaring IBM the emphatic victor in the long-running saga.

“IT IS ORDERED AND ADJUDGED that pursuant to the orders of the court entered on July 10, 2013, February 5, 2016, and February 8, 2016, judgment is entered in favour of the defendant and plaintiff’s causes of action are dismissed with prejudice,” stated the document.

Now, though, SCO has filed yet again to appeal that judgment, although the precise grounds it is claiming haven’t yet been disclosed.

SCO is being represented by the not-inexpensive law firm of Boies, Schiller & Flexner, which successfully represented the US government against Microsoft in the antitrust case in the late 1990s. Although SCO is officially bankrupt, it’s unclear who continues to bankroll the case. Its one remaining “asset” is its claims for damages against IBM.

Meanwhile, despite the costs of the case, IBM has fought SCO vigorously, refusing to throw even a few million dollars at the company by way of settlement, since doing so would only encourage what remains of the company to pursue other, presumably easier, open source targets.

Courtesy-TheInq

 

Are Quantum Computers On The Horizon?

March 18, 2016 by  
Filed under Computing


Massachusetts Institute of Technology (MIT) and Austria’s University of Innsbruck claim to have put together a working quantum computer capable of solving a simple mathematical problem.

The architecture they have devised ought to be relatively easy to scale, and could therefore form the basis of workable quantum computers in the future – with a bit of “engineering effort” and “an enormous amount of money”, according to Isaac Chuang, professor of physics, electrical engineering and computer science at MIT.

Chuang’s team has put together a prototype comprising the first five quantum bits (or qubits) of a quantum computer. This is being tested on mathematical factoring problems, which could have implications for applications that use factoring as the basis for encryption to keep information, including credit card details, secure.

The proof-of-concept has been applied only to the number 15, but the researchers claim that this is the “first scalable implementation” of quantum computing to run Shor’s algorithm, a quantum algorithm that can quickly calculate the prime factors of large numbers.
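The quantum part of Shor’s algorithm finds the period r of f(x) = a^x mod n; turning that period into factors is purely classical. A minimal Python sketch of that classical post-processing step, using the textbook case n = 15 (the choice of base a = 7 here is illustrative):

```python
from math import gcd

def factors_from_period(a: int, r: int, n: int):
    """Classical post-processing of Shor's algorithm: given the
    period r of f(x) = a^x mod n (found by the quantum part),
    derive non-trivial factors of n."""
    if r % 2 != 0:
        return None  # need an even period; retry with another base
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None  # trivial square root; retry with another base
    return gcd(x - 1, n), gcd(x + 1, n)

# For n = 15 with base a = 7, the period of 7^x mod 15 is r = 4.
print(factors_from_period(7, 4, 15))  # → (3, 5)
```

The quantum speed-up lives entirely in finding r; everything above runs in microseconds on any laptop, which is why the hard part is the five-qubit hardware, not this step.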

“The team was able to keep the quantum system stable by holding the atoms in an ion trap, where they removed an electron from each atom, thereby charging it. They then held each atom in place with an electric field,” explained MIT.

Chuang added: “That way, we know exactly where that atom is in space. Then we do that with another atom, a few microns away – [a distance] about 100th the width of a human hair.

“By having a number of these atoms together, they can still interact with each other because they’re charged. That interaction lets us perform logic gates, which allow us to realise the primitives of the Shor factoring algorithm. The gates we perform can work on any of these kinds of atoms, no matter how large we make the system.”

Chuang is a pioneer in the field of quantum computing. He designed a quantum computer in 2001 based on one molecule that could be held in ‘superposition’ and manipulated with nuclear magnetic resonance to factor the number 15.

The results represented the first experimental realisation of Shor’s algorithm. But the system wasn’t scalable, becoming ever more difficult to control as more atoms were added.

However, the architecture that Chuang and his team have put together is, he believes, highly scalable and will enable the team to build quantum computing devices capable of factoring much larger numbers.

“It might still cost an enormous amount of money to build, [and] you won’t be building a quantum computer and putting it on your desktop anytime soon, but now it’s much more an engineering effort and not a basic physics question,” said Chuang.

In other quantum computing news this week, the UK government has promised £200m to support engineering and physical sciences PhD students and fuel UK research into quantum technologies, although most of the cash will be spent on Doctoral Training Partnerships rather than trying to build workable quantum computing prototypes.

Courtesy-TheInq


IBM’s Watson Goes IoT

January 4, 2016 by  
Filed under Computing


 

IBM has announced a major expansion in Europe with the establishment of a new HQ for Watson Internet of Things (IoT).

The Munich site establishes a global headquarters for the Watson IoT program which is dedicated to launching “offerings, capabilities and ecosystem partners” designed to bring the cognitive powers of the company’s game show winning supercomputer to billions of tiny devices and sensors.

Some 1,000 IBM developers, consultants, researchers and designers will join the Munich facility, which the company describes as an “innovation super center”. It is the biggest IBM investment in Europe for over 20 years.

IBM Cloud will power a series of APIs that will allow IoT developers to harness Watson within their devices.

“The IoT will soon be the largest single source of data on the planet, yet almost 90 percent of that data is never acted on,” said Harriet Green, general manager for Watson IoT and Education.

“With its unique abilities to sense, reason and learn, Watson opens the door for enterprises, governments and individuals to finally harness this real-time data, compare it with historical data sets and deep reservoirs of accumulated knowledge, and then find unexpected correlations that generate new insights to benefit business and society alike.”

The APIs were first revealed in September and new ones for the IoT were announced today.

These include the Natural Language Processing API, which interprets language in context and is able to respond in the same natural way; the Machine Learning Watson API, which detects patterns in order to perform a repeated task better each time or adapt its method to suit; the Video and Image Analytics API, which can infer information from video feeds; and the Text Analytics Watson API, which can glean information from unstructured text data such as Twitter feeds.
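As a rough illustration only (the endpoint and payload shape below are hypothetical, not IBM’s documented Watson API), a developer-side call to a text-analytics service of this kind typically boils down to posting a JSON document and reading back structured annotations:

```python
import json

# Hypothetical endpoint -- for illustration only, not a real Watson URL.
SERVICE_URL = "https://example.com/text-analytics/v1/analyze"

def build_request(tweets):
    """Compose the JSON body for a (hypothetical) text-analytics call
    over unstructured text such as Twitter feeds."""
    return json.dumps({
        "documents": [{"id": i, "text": t} for i, t in enumerate(tweets)],
        "features": ["entities", "sentiment"],
    })

body = build_request(["Watson goes IoT", "1,000 developers move to Munich"])
print(json.loads(body)["features"])  # → ['entities', 'sentiment']
```

The real APIs sit behind IBM Cloud authentication, but the shape of the exchange (unstructured text in, annotated structure out) is the point being sold to IoT developers.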

The company will also open eight regional centres across four continents to give customers in those territories the opportunity to access information and experiences.

Courtesy-TheInq

 

Seagate Goes 8TB For Surveillance

November 13, 2015 by  
Filed under Computing


Seagate has become the first hard drive company to create an 8TB unit aimed specifically at the surveillance market, targeting system integrators, end users and system installers.

The Seagate Surveillance HDD, as those wags in marketing have named it, is the highest capacity of any specialist drive for security camera set-ups, and Seagate cites its main selling points as maximizing uptime while removing the need for excess support.

“Seagate has worked closely with the top surveillance manufacturers to evolve the features of our Surveillance HDD products and deliver a customized solution that has precisely matched market needs in this evolving space for the last 10 years,” said Matt Rutledge, Seagate’s senior vice president for client storage.

“With HD recordings now standard for surveillance applications, Seagate’s Surveillance HDD product line has been designed to support these extreme workloads with ease and is capable of a 180TB/year workload, three times that of a standard desktop drive.

“It also includes surveillance-optimized firmware to support up to 64 cameras and is the only product in the industry that can support surveillance solutions, from single-bay DVRs to large multi-bay NVR systems.”

The 3.5in drive is designed to run 24/7 and is able to capture 800 hours of high-definition video from up to 64 cameras simultaneously, making it ideal for shopping centers, urban areas, industrial complexes and anywhere else you need to feel simultaneously safe and violated. Its capacity will allow 6PB in a 42U rack.
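The rack-density claim is easy to sanity-check from the article’s own figures (decimal units assumed): 6PB at 8TB per drive implies 750 drives in the 42U rack, and the 180TB/year rating is three times that of a desktop drive.

```python
DRIVE_TB = 8     # capacity per Surveillance HDD
RACK_PB = 6      # claimed capacity of a 42U rack

drives_per_rack = RACK_PB * 1000 // DRIVE_TB
print(drives_per_rack)  # → 750

# Rated workload is 180 TB/year, said to be three times a desktop drive's.
desktop_tb_per_year = 180 / 3
print(desktop_tb_per_year)  # → 60.0
```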

Included in the deal is the Seagate Rescue Service, capable of restoring lost data in two weeks if circumstances permit, and sold with end users in mind for whom an IT support infrastructure is either non-existent or off-site. The service has a 90 percent success rate and is available as part of the drive cost for the first three years.

Seagate demonstrated the drive today at the China Public Security Expo. Where better than the home of civil liberty infringement to show off the new drive?

Earlier this year, Seagate announced a new co-venture with SSD manufacturer Micron, which will come as a huge relief after the recent merger announcement between WD and SanDisk.

Courtesy-http://www.thegurureview.net/computing-category/seagate-goes-8tb-for-surveillance.html

Suse Goes 64-bit ARM Servers

July 28, 2015 by  
Filed under Computing


Suse wants to speed the development of server systems based on 64-bit ARM processors.

The outfit said that it is making available to its partners a version of Suse Linux Enterprise 12 ported to ARM’s 64-bit architecture (AArch64).

This will enable them to develop, test and deliver products to the market based on ARM chips.

Suse has also implemented support for AArch64 into its openSUSE Build Service. This allows the community to build packages against real 64-bit ARM hardware and the Suse Linux Enterprise 12 binaries.

Hopefully this will improve the time to market for ARM-based solutions, the firm said.

Suse partners include chip makers AMD, AppliedMicro and Cavium, as well as system vendors Dell, HP and SoftIron. Suse wants ARM processors to be part of a scalable technology platform in the data centre.

Through participation in the programme, partners will be able to build solutions for various applications, from purpose-built appliances for security, medical and network functions, to hyperscale computing, distributed storage and software-defined networking.

There are multiple vendors using the same core technology licensed from ARM. This provides a common base for the OS vendors, like Suse, to build support in their kernel.

Suse has some competition for ARM-based systems. Last year, Red Hat started up its ARM Partner Early Access Programme (PEAP), while Canonical has offered ARM support in its Ubuntu platform for several years now, including a long-term support (LTS) release last year that included the OpenStack cloud computing framework.

Source

IBM Buys Blue Box

June 15, 2015 by  
Filed under Computing


IBM HAS ACQUIRED Blue Box in an attempt to make its cloud offering even bluer. The Seattle-based company specialises in simple platform-as-a-service clouds based on OpenStack.

This, of course, fits in with IBM’s new direction of a Power-based, OpenStack cloud world, as demonstrated by its collaboration with MariaDB on TurboLAMP.

IBM’s move to the cloud is starting to pay off, seeing revenue of $7.7bn in the 12 months to March 2015 and growing more than 16 percent in the first quarter of this year.

The company plans to use the new acquisition to help customers rapidly integrate cloud-based applications and on-premise systems within the OpenStack managed cloud.

Blue Box also brings a remotely managed OpenStack to provide customers with a local cloud, better visibility and control, and tighter security.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meets their current and future business needs,” said IBM general manager of cloud services Jim Comfort.

“The acquisition of Blue Box accelerates IBM’s open cloud strategy, making it easier for our clients to move data and applications across clouds and adopt hybrid cloud environments.”

Blue Box will offer customers a more cohesive, consistent and simplified experience, while at the same time integrating with existing IBM packages like the Bluemix digital innovation platform. The firm also offers a single unified control panel for customer operations.

“No brand is more respected in IT than IBM. Blue Box is building a similarly respected brand in OpenStack,” said Blue Box founder and CTO Jesse Proudman.

“Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds.

“This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they’ve seen thus far in the market.”

IBM has confirmed that it plans to help Blue Box customers to grow their technology portfolio, while taking advantage of the broader IBM product set.

Source

Will A.I. Create The Next Industrial Revolution?

June 2, 2015 by  
Filed under Computing


Artificial Intelligence will be responsible for the next industrial revolution, experts in the field have claimed, as intelligent computer systems replace certain human-operated jobs.

Four computer science experts talked about how advances in AI could lead to a “hollowing out” of middle-income jobs during a panel debate hosted by ClickSoftware about the future of technology.

“It’s really important that we take AI seriously. It will lead to the fourth industrial revolution and will change the world in ways we cannot predict now,” said AI architect and author George Zarkadakis.

His mention of the “fourth industrial revolution” refers to the computerization of the manufacturing industry.

If the first industrial revolution was the mechanisation of production using water and steam power, followed by the second which introduced mass production with the help of electric power, then the third is what we are currently experiencing: the digital revolution and the use of electronics and IT to further automate production.

The fourth industrial revolution, which is sometimes referred to as Industry 4.0, is the vision of the ‘smart factory’, where cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralized decisions.

These cyber-physical systems communicate and cooperate with each other and humans in real time over the Internet of Things.

Dan O’Hara, professor of cognitive computing at Goldsmiths, University of London, explained that this fourth industrial revolution will not be the same kind of “hollowing out” of jobs that we saw during the last one.

“It [won’t be] manual labour replaced by automation, but it’ll be the hollowing out of middle-income jobs, medium-skilled jobs,” he said.

“The industries that will be affected the most from a replacement with automation are construction, accounts and transport. But the biggest [industry] of all, remembering this is respective to the US, is retail and sales.”

O’Hara added that many large organisations’ biggest expense is people, who already work alongside intelligent computer systems, and this area is most likely to be affected as companies look to reduce costs.

“Anything that’s working on an AI-based system is bound to be very vulnerable to the replacement by AI as it’s easily automated already,” he said.

However, while AI developments in the retail space could lead to the replacement of jobs, it is also rather promising at the same time.

Mark Bishop, professor of cognitive computing at Goldsmiths, highlighted that AI could save businesses money if it becomes smart enough to determine price variants in company spending, for example, scanning through years of an organisation’s invoice database and detecting the cheapest costs and thus saving on outgoings.
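Stripped of the AI, the core of that invoice-scanning idea is finding the cheapest observed price per item across suppliers; a toy Python sketch (the sample data is invented for illustration):

```python
invoices = [  # invented sample data standing in for years of records
    {"supplier": "A", "item": "toner", "unit_price": 34.00},
    {"supplier": "B", "item": "toner", "unit_price": 29.50},
    {"supplier": "A", "item": "paper", "unit_price": 3.20},
    {"supplier": "C", "item": "paper", "unit_price": 2.90},
]

# Keep, for each item, the invoice line with the lowest unit price.
cheapest = {}
for inv in invoices:
    item = inv["item"]
    if item not in cheapest or inv["unit_price"] < cheapest[item]["unit_price"]:
        cheapest[item] = inv

print({k: v["supplier"] for k, v in cheapest.items()})  # → {'toner': 'B', 'paper': 'C'}
```

The “smart” part Bishop describes is everything around this loop: recognising that two differently worded invoice lines refer to the same item, which is where machine learning earns its keep.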

While some worry that AI will take over jobs, others have said that it will replace humans altogether.

John Lewis IT chief Paul Coby said earlier this year that the blending of AI and the IoT in the future could signal the end of civilisation as we know it.

Coby explained that the possibilities are already with us in terms of AI and that we ought to think about how “playing with the demons” could be detrimental to our future.

Apple co-founder Steve Wozniak added to previous comments from Stephen Hawking and Elon Musk with claims that “computers are going to take over from humans”.

Woz made his feelings on AI known during an interview with the Australian Financial Review, and agreed with Hawking and Musk that its potential to surpass humans is worrying.

“Computers are going to take over from humans, no question. Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people,” he said.

Source

SUSE Brings Hadoop To IBM z Mainframes

April 1, 2015 by  
Filed under Computing


SUSE and Apache Hadoop vendor Veristorm are teaming up to bring Hadoop to IBM z and IBM Power systems.

The result will mean that regardless of system architecture, users will be able to run Apache Hadoop within a Linux container on their existing hardware, meaning that more users than ever will be able to process big data into meaningful information to inform their business decisions.
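Hadoop’s canonical “raw data into meaningful information” example is word count; the map and reduce stages it distributes across a cluster reduce to something like this single-process Python sketch:

```python
from collections import Counter
from itertools import chain

def map_phase(record):
    """Map: emit a (word, 1) pair for every word in one input record."""
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts for each distinct word."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

records = ["big data", "big decisions"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(pairs)
print(counts)  # → {'big': 2, 'data': 1, 'decisions': 1}
```

Hadoop’s contribution is running the map phase in parallel next to where the data lives, which is exactly the appeal of putting it directly on the mainframe that holds the records.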

SUSE’s Veristorm Data Hub and vStorm Enterprise Hadoop will now be available as zDoop, the first mainframe-compatible Hadoop iteration, running on SUSE Linux Enterprise Server for System z, and on IBM Power8 machines in little-endian mode, which makes it significantly easier for x86-based software to be ported to the IBM platform.

SUSE and Veristorm have also committed to work together on educating partners and channels on the benefits of the overall package.

Naji Almahmoud, head of global business development for SUSE, said: “The growing need for big data processing to make informed business decisions is becoming increasingly unavoidable.

“However, existing solutions often struggle to handle the processing load, which in turn leads to more servers and difficult-to-manage sprawl. This partnership with Veristorm allows enterprises to efficiently analyse their mainframe data using Hadoop.”

Veristorm launched Hadoop for Linux in April of last year, explaining that it “will help clients to avoid staging and offloading of mainframe data to maintain existing security and governance controls”.

Sanjay Mazumder, CEO of Veristorm, said that the partnership will help customers “maximize their processing ability and leverage their richest data sources” and deploy “successful, pragmatic projects”.

SUSE has been particularly active of late, announcing last month that its software-defined Enterprise Storage product, built around the open source Ceph framework, was to become available as a standalone product for the first time.

Source

NSA Developing System To Crack Encryption

January 13, 2014 by  
Filed under Computing


The U.S. National Security Agency is working to develop a computer that could ultimately break most encryption programs, whether they are used to protect other nations’ spying programs or consumers’ bank accounts, according to a report by the Washington Post.

The report, which the newspaper said was based on documents leaked by former NSA contractor Edward Snowden, comes amid continuing controversy over the spy agency’s program to collect the phone records and Internet communications of private citizens.

In its report, The Washington Post said that the NSA is trying to develop a so-called “quantum computer” that could be used to break encryption codes used to cloak sensitive information.

Such a computer, which would be able to perform several calculations at once instead of in a single stream, could take years to develop, the newspaper said. In addition to being able to break through the cloaks meant to protect private data, such a computer would have implications for such fields as medicine, the newspaper reported.
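The “several calculations at once” intuition comes from state size: an n-qubit register holds amplitudes for all 2^n basis states simultaneously, so the description a classical machine must track doubles with every qubit added. A small sketch:

```python
def statevector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (1, 5, 10, 50):
    print(n, statevector_size(n))
# 50 qubits already require 2**50 (about 1.1e15) amplitudes -- far
# beyond what can be enumerated one stream at a time, which is why
# building such a machine could take years, and why it would matter.
```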

The research is part of a $79.7 million research program called “Penetrating Hard Targets,” the newspaper said. Other, non-governmental researchers are also trying to develop quantum computers, and it is not clear whether the NSA program lags the private efforts or is ahead of them.

Snowden, living in Russia with temporary asylum, last year leaked documents he collected while working for the NSA. The United States has charged him with espionage, and more charges could follow.

His disclosures have sparked a debate over how much leeway to give the U.S. government in gathering information to protect Americans from terrorism, and have prompted numerous lawsuits.

Last week, a federal judge ruled that the NSA’s collection of phone call records is lawful, while another judge earlier in December questioned the program’s constitutionality. The issue is now more likely to move before the U.S. Supreme Court.

On Thursday, the editorial board of the New York Times said that the U.S. government should grant Snowden clemency or a plea bargain, given the public value of revelations over the National Security Agency’s vast spying programs.

Source

Google Goes Quantum

October 22, 2013 by  
Filed under Computing


When is a blink not a natural blink? For Google the question has such ramifications that it has devoted a supercomputer to solving the puzzle.

Slashgear reports that the internet giant is using its $10 million quantum computer to find out how products like Google Glass can differentiate between a natural blink and a deliberate blink used to trigger functionality.

The supercomputer based at Google’s Quantum Artificial Intelligence Lab is a joint venture with NASA and is being used to refine the algorithms used for new forms of control such as blinking. The supercomputer uses D-Wave chips kept at as near to absolute zero as possible, which makes it somewhat impractical for everyday wear but amazingly fast at solving brainteasers.

A Redditor reported earlier this year that Google Glass is capable of taking pictures in response to blinking; however, the feature is disabled in the software code because the technology had not advanced enough to differentiate between a natural impulse and an intentional request.
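One plausible way to separate the two (an assumption on our part, not Google’s published method) is blink duration: involuntary blinks are brief, while a deliberate trigger blink is held longer. A minimal sketch of such a classifier:

```python
# Assumed threshold -- involuntary blinks typically last well under a
# third of a second, while a deliberate trigger blink is held longer.
NATURAL_MAX_MS = 300

def classify_blink(duration_ms: float) -> str:
    """Classify a blink by how long the eye stayed closed."""
    return "deliberate" if duration_ms > NATURAL_MAX_MS else "natural"

print(classify_blink(120))  # → natural
print(classify_blink(600))  # → deliberate
```

A fixed threshold like this is exactly the kind of brittle heuristic the quantum computer is presumably being used to improve on, since real blink durations vary by person and context.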

It is easy to see the potential of blink control. Imagine being able to capture your life as you live it, exactly the way you see it, without anyone ever having to stop and ask people to say “cheese”.

Google Glass is due for commercial release next year but for the many beta testers and developers who already have one this research could lead to an even richer seam of touchless functionality.

If nothing else you can almost guarantee that Q will have one ready for Daniel Craig’s next James Bond outing.

Source
