Syber Group

SkySQL Joins IBM On SQL Merger

April 18, 2014
Filed under Computing


SkySQL has announced a line of MariaDB products that combine NoSQL and SQL technology, offering users the ability to handle large unstructured data sets alongside traditional database features to ensure data consistency.

Available immediately, MariaDB Enterprise 2 and MariaDB Enterprise Cluster 2 are based on the code used in the firm’s MariaDB 10 database server, which it also released today.

According to SkySQL, the availability of an enterprise grade SQL database system with NoSQL interoperability will be a game changer for developers building revenue generating applications and database administrators in charge of large, complex environments.

The two new products have been developed with support from other partners in the open source community, including Red Hat, IBM and Google, according to the firm, and are aimed at giving IT managers more options for managing large volumes of data.

In fact, Red Hat will use MariaDB Enterprise 2 as the default database for its enterprise customers, while Google has also moved large parts of its infrastructure to MariaDB, according to Dion Cornett, VP of Global Sales for SkySQL.

Cornett said that customers have been using a wide variety of databases over the past few years in order to meet the diverse requirements of applications.

“The types of applications have evolved over time, and the challenge we now have today is that people have different IT stack structures, and trying to integrate all that has been very challenging and required lots of custom code to be created. What we’re doing with MariaDB is introducing an array of features to combine the best of both worlds,” he said.

The features are designed to let developers and database administrators integrate many different data structures and use them in a cohesive application, in the same way that standard database tools presently allow.

These include the Connect Storage Engine, which enables access to a wide variety of file formats such as XML and CSV files, and the ability to run familiar SQL commands against that data.
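MariaDB's CONNECT engine does this natively at the storage layer. As a rough illustration of the concept, not of MariaDB's actual syntax, the Python sketch below maps invented CSV data into an in-memory SQLite table and then runs an ordinary SQL query against it:

```python
import csv
import io
import sqlite3

# Sample CSV data standing in for an external file that a storage
# engine like CONNECT might expose as a table (data is invented).
csv_text = "name,amount\nalice,100\nbob,250\ncarol,175\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (name TEXT, amount INTEGER)")

reader = csv.DictReader(io.StringIO(csv_text))
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [(row["name"], int(row["amount"])) for row in reader],
)

# Once the file's contents are mapped, familiar SQL just works.
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 525
```

The point of the engine is that the mapping step disappears: the file itself is the table, and the SQL layer stays unchanged.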

A key feature is dynamic columns, which enables MariaDB to “smartly interpret” incoming data and adapt it to the data structure that best fits, according to Cornett.

“At a technical level what you’re actually looking at are files within the cells of information that can vary in size, which is not a capability you’ve traditionally had in databases and that flexibility is a big leap forward,” he said.
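MariaDB implements this with functions such as COLUMN_CREATE() and COLUMN_GET() over a binary blob per cell. The simplified Python analogue below, using JSON text and invented data, shows the core idea: rows in the same table whose attribute sets differ in shape and size.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# One fixed column plus a blob whose internal structure varies per
# row -- the rough idea behind dynamic columns.
conn.execute("CREATE TABLE items (id INTEGER, attrs TEXT)")
rows = [
    (1, json.dumps({"colour": "red", "size": "L"})),
    (2, json.dumps({"colour": "blue", "weight_kg": 1.2})),  # different keys
]
conn.executemany("INSERT INTO items VALUES (?, ?)", rows)

# Query by a key that only some rows need to carry.
result = []
for item_id, attrs in conn.execute("SELECT id, attrs FROM items ORDER BY id"):
    result.append((item_id, json.loads(attrs).get("colour")))
print(result)  # [(1, 'red'), (2, 'blue')]
```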

The new MariaDB products can also plug into the Apache Cassandra storage engine, reading from and writing to a columnar data store as if it were a traditional SQL table.

An example of how MariaDB Enterprise 2 might be used is if a service provider has a large-scale video server and wants to combine that with billing information, Cornett said.

“The customer’s video history and what they’re consuming could be very unstructured, but the billing structure will be very fixed, and it has been something of a challenge to bring the two of those together up to this point,” he explained.

Source

IBM Breaks Big Data Record

February 28, 2014
Filed under Computing


IBM Labs claims to have broken a speed record for Big Data, which the company says could help boost internet speeds to 200 to 400Gbps using “extremely low power”.

The scientists achieved the speed record using a prototype device presented at the International Solid-State Circuits Conference (ISSCC) this week in San Francisco.

Apparently the device, which employs analogue-to-digital converter (ADC) technology, could be used to improve the transfer speed of Big Data between clouds and data centres to four times faster than existing technology.

IBM said its device is fast enough that 160GB – the equivalent of a two-hour 4K ultra-high definition (UHD) movie or 40,000 music tracks – could be downloaded in a few seconds.

The IBM researchers have been developing the technology in collaboration with Swiss research institution Ecole Polytechnique Fédérale de Lausanne (EPFL) to tackle the growing demands of global data traffic.

“As Big Data and internet traffic continues to grow exponentially, future networking standards have to support higher data rates,” the IBM researchers explained, comparing data transfer per day in 1992 of 100GB to today’s two exabytes per day, a 20 million-fold increase.

“To support the increase in traffic, ultra-fast and energy efficient analogue-to-digital converter (ADC) technology [will] enable complex digital equalisation across long-distance fibre channels.”

An ADC device converts analogue signals to digital, estimating the right combination of zeros and ones to digitally represent the data so it can be stored on computers and analysed for patterns and predictive outcomes.
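A real ADC is analogue hardware, but the quantisation step it performs, mapping a continuous sample to the nearest integer code, can be sketched in a few lines of Python. The bit width and input signal here are arbitrary choices for illustration:

```python
import math

def quantize(sample: float, bits: int = 4) -> int:
    """Map an analogue sample in [-1.0, 1.0] to a signed integer code."""
    levels = 2 ** (bits - 1) - 1  # e.g. 7 usable levels for 4-bit signed
    return round(sample * levels)

# Digitise one cycle of a sine wave at 8 sample points.
codes = [quantize(math.sin(2 * math.pi * n / 8)) for n in range(8)]
print(codes)  # [0, 5, 7, 5, 0, -5, -7, -5]
```

The engineering challenge IBM describes is doing this billions of times per second at very low power, not the arithmetic itself.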

“For example, scientists will use hundreds of thousands of ADCs to convert the analogue radio signals that originate from the Big Bang 13 billion years ago to digital,” IBM said.

The ADC technology has been developed as part of an international project called Dome, a collaboration between the Netherlands Institute for Radio Astronomy (ASTRON), DOME-South Africa and IBM to build the Square Kilometer Array (SKA), which will be the world’s largest and most sensitive radio telescope when it’s completed.

“The radio data that the SKA collects from deep space is expected to produce 10 times the global internet traffic and the prototype ADC would be an ideal candidate to transport the signals fast and at very low power – a critical requirement considering the thousands of antennas which will be spread over 1,900 miles,” IBM explained.

IBM Research Systems department manager Dr Martin Schmatz said, “Our ADC supports Institute of Electrical and Electronics Engineers (IEEE) standards for data communication and brings together speed and energy efficiency at 32 nanometers, enabling us to start tackling the largest Big Data applications.”

He said that IBM is developing the technology for its own family of products, ranging from optical and wireline communications to advanced radar systems.

“We are bringing our previous generation of the ADC to market less than 12 months since it was first developed and tested,” Schmatz added, noting that the firm will develop the technology in communications systems such as 400Gbps opticals and advanced radars.

Source

Techies Demand More Money

February 11, 2014
Filed under Around The Net


Employers may need to loosen their purse strings to retain their IT staffers in 2014, according to a salary survey from IT career website Dice.com.

Among the tech workers who anticipate changing employers in 2014, 68 percent listed more compensation as their reason for leaving. Other factors include improved working conditions (48 percent), more responsibility (35 percent) and the possibility of losing their job (20 percent). The poll, conducted online between Oct. 14 and Nov. 29 last year, surveyed 17,236 tech professionals.

Fifty-four percent of the workers polled weren’t content with their compensation. This figure is down from 2012’s survey, when 57 percent of respondents were displeased with their pay.

The decrease in salary satisfaction could mean companies will face IT staff retention challenges this year, since 65 percent of respondents said they’re confident they can find a new, better position in 2014.

This dissatisfaction over pay comes even though the survey, released Wednesday, showed that the average tech salary rose 2.6 percent in 2013 to US$87,811 and that more companies gave merit raises. The main reason for last year’s bump in pay, according to 45 percent of respondents, was a merit raise. In comparison, the average tech salary was $85,619 in 2012 and 40 percent of those polled said they received a merit raise.

Meanwhile, 26 percent of respondents attributed their 2013 salary increase to taking a higher-paying job at another company.

Employers realize tech talent is coveted and are attempting to keep workers satisfied by offering them a variety of incentives, the survey found. In 2013, 66 percent of employers provided incentives to retain workers. The two most popular incentives were increased compensation and more interesting work. Incentives that allow employees to better balance their work and personal lives were also offered, such as telecommuting and a flexible work schedule.

Skills that commanded six-figure jobs in 2013 came from some of the hottest areas of IT. Data science led the way, with big data backgrounds yielding some of the highest salaries. People who know R, the popular statistical computing language, can expect to make $115,531 on average, while those with NoSQL database development skills command an average salary of $114,796. IT pros skilled in MapReduce to process large data sets make $114,396 on average.

Source

Did A Hacker OD?

January 16, 2014
Filed under Computing


Top hacker Barnaby Jack died from mixing too many drugs in one session, a coroner’s report shows. Kiwi-born Jack was supposed to give a talk at a security conference when he was found dead in his bed.

Conspiracy nuts raised an eyebrow or two when it was revealed that Jack’s death occurred shortly before he was due to demonstrate how heart implants could be hacked at the Black Hat security conference in Las Vegas. He did not have a mark on him and showed no signs of trauma. However, now a coroner’s report has shown that Jack had a mix of heroin, cocaine and prescription drugs in his system. And he died of “acute mixed drug intoxication.”

Jack rose to fame after a 2010 demonstration, in which he hacked a cash machine, making it give out money. Jack’s girlfriend had found him lying in bed unresponsive, with “multiple bottles of beer and champagne” in the rubbish bin, so it must have been a hell of a night.

Source

Is SAP Searching In The Clouds?

December 6, 2013
Filed under Computing


SAP, the maker of esoteric business software that no one really understands, is debating whether to accelerate moving more of its business to the cloud.

The move would be a change in strategy which might initially have only a small impact on its sales. Co-chief executive Jim Hagemann-Snabe said the change would generate more sales by 2017 particularly in markets like the US where there is a big push onto the cloud.

Speaking at a Morgan Stanley investor conference this morning, Hagemann-Snabe said the shift would have some impact at the 2015 level: “I don’t expect enormous impact, but it would have some impact because you are delaying some revenues. In the long term, however, it makes a lot of sense.” Long-term sense is not the sort of thing people expect from SAP.

Source

SalesForce Goes Hacking

November 7, 2013
Filed under Computing


Salesforce.com really wants to attract lots of developers to its Dreamforce conference next month in San Francisco. As in, really.

Last Friday, the cloud software vendor announced a “hackathon” would be held at the conference, with US$1 million going to the developer or team who creates the top prize-winning mobile application with Salesforce.com technology.

“It’s not going to be easy — $1 million is going to bring out the best of the best,” Salesforce.com said in Friday’s announcement. “So don’t wait until Dreamforce! You’re going to want to get started now. With Force.com, Heroku, ExactTarget Fuel, Mobile Services and more — you’ve got a killer array of platform technology to use.”

Salesforce.com will also be providing some “pretty amazing new technology” for use at the show, the announcement adds.

In order to participate, developers have to either register for a full conference pass or a special $99 hacker pass.

The hackathon reflects Salesforce.com’s long courtship of developers to its development technologies, its AppExchange marketplace and recent efforts to build out more tooling for mobile application development.

Developers taking part in the hackathon will have plenty of competition, with some 20,000 programmers expected to attend Dreamforce overall. A “Hack Central” area will be open around the clock, supporting coders who want to work until the wee hours on their application.

In order to qualify, an application can’t have been previously released. The entries will be judged on four criteria counting 25 percent each: innovation, business value, user experience and use of Salesforce.com’s platform.
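With four criteria weighted at 25 percent each, an overall score is just a weighted sum. A minimal sketch, with hypothetical per-criterion scores out of 10:

```python
# The four judging criteria, weighted equally at 25 percent each.
WEIGHTS = {
    "innovation": 0.25,
    "business_value": 0.25,
    "user_experience": 0.25,
    "platform_use": 0.25,
}

def overall(scores: dict) -> float:
    """Weighted sum of per-criterion scores."""
    return sum(scores[k] * w for k, w in WEIGHTS.items())

# Hypothetical entry scored out of 10 on each criterion.
entry = {"innovation": 8, "business_value": 6,
         "user_experience": 9, "platform_use": 7}
print(overall(entry))  # 7.5
```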

The second-place finisher will receive $50,000, with $25,000 going to the third-place winner. Fourth and fifth place will get $10,000 and $5,000, respectively.

Some 120,000 people are expected to register for Dreamforce this year. While some of that total will be watching online rather than in person, Dreamforce is now operating at a scale rivaling Oracle’s OpenWorld event, which happened last month.

Source

SAP To Stop Offering SME

November 1, 2013
Filed under Computing


SAP, the maker of expensive esoteric software that no one really understands, has decided to pull the plug on its offering for small businesses. Business weekly Wirtschaftswoche said SAP would stop development of the software, dubbed Business ByDesign, although existing customers will be able to continue using it.

SAP insists that development capacity for Business ByDesign was being reduced, but that the product was not being shut down. Business ByDesign was launched in 2010 and was supposed to generate $1 billion of revenue. The product, which cost roughly 3 billion euros to develop, currently has only 785 customers and is expected to generate no more than 23 million euros in sales this year.

The Wirtschaftswoche report said that ever since the SAP product’s launch, customers had complained about technical issues and the slow speed of the software.

Source

Oracle Goes After SAP’s HANA

October 4, 2013
Filed under Consumer Electronics


Oracle has upped its game in its fight against SAP HANA, having added in-memory processing to its Oracle 12c database management system, which it claims will speed up queries by 100 times.

Oracle CEO Larry Ellison revealed the update on Sunday evening during his opening keynote at the Oracle Openworld show in San Francisco.

The in-memory option for Oracle Database 12c is designed to ramp up the speeds of data queries – and will also give Oracle a new weapon in the fight against SAP’s rival HANA in-memory system.

“When you put data in memory, one of the reasons you do that is to make the system go faster,” Ellison said. “It will make queries go faster, 100 times faster. You can load the same data into the identical machines, and it’s 100 times faster, you get results at the speed of thought.”

Ellison was keen to allay concerns that these faster query times would have a negative impact on transactions.

“We didn’t want to make transactions go slower with adding and changing data in the database. We figured out a way to speed up query processing and at least double your transaction processing rates,” he said.

In traditional databases, data is stored in rows, for example a row of sales orders, Ellison explained. These types of row format databases were designed to operate at high speeds when processing a few rows that each contain lots of columns. More recently, a new format was proposed to store data in columns rather than rows to speed up query processing.

Oracle plans to store the data in both formats simultaneously, according to Ellison, so transactions run faster in the row format and analytics run faster in column format.
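The trade-off can be sketched with toy data in Python: the row layout hands back a whole record at once, while the column layout lets an aggregate touch only the column it needs. This illustrates the access patterns, not Oracle's implementation:

```python
# The same three sales orders held in both layouts (toy data).
rows = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 175},
]

# Column format: one contiguous list per column.
columns = {
    "id": [r["id"] for r in rows],
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}

# Row layout suits transactions: fetch a whole order in one access.
order = rows[1]

# Column layout suits analytics: sum one column without reading the rest.
total = sum(columns["amount"])
print(order["region"], total)  # US 525
```

Keeping both layouts in memory simultaneously, as Ellison describes, pays a storage cost to get the better access pattern for each workload.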

“We can process data at ungodly speeds,” Ellison claimed. As evidence of this, Oracle demoed the technology, showing seven billion rows could be queried per second via in-memory compared to five million rows per second in a traditional database.

The new approach also allows database administrators to speed up their workloads by removing the requirement for analytics indexes.

“If you create a table in Oracle today, you create the table but also decide which columns of the table you’ll create indexes for,” Ellison explained. “We’re replacing the analytics indexes with the in-memory option. Let’s get rid of analytic indexes and replace them with the column store.”

Ellison added that firms can choose to have just part of the database for in-memory querying. “Hot data can be in DRAM, you can have some in flash, some on disk,” he noted. “Data automatically migrates from disk into flash into DRAM based on your access patterns. You only have to pay by capacity at the cost of disk.”
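A toy version of such access-driven placement can be sketched in Python. The tier names, capacities, and access counts here are all invented, and Oracle's actual migration logic is of course far more involved:

```python
def assign_tiers(access_counts: dict,
                 dram_slots: int = 1,
                 flash_slots: int = 2) -> dict:
    """Place the hottest items in DRAM, warm items in flash, the rest on disk."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    tiers = {}
    for i, key in enumerate(ranked):
        if i < dram_slots:
            tiers[key] = "dram"
        elif i < dram_slots + flash_slots:
            tiers[key] = "flash"
        else:
            tiers[key] = "disk"
    return tiers

# Hypothetical access counts per table.
counts = {"orders": 900, "inventory": 300, "archive": 5, "logs": 40}
print(assign_tiers(counts))
```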

Firms wanting to take advantage of this new in-memory option can do so straightaway, according to Ellison, with no need for changes to functions, no loading or reloading of data, and no data migration. Costs were not disclosed.

And for those firms keen to rush out and invest in new hardware to take advantage of this new in-memory option, Ellison took the wraps off the M6-32, dubbed the Big Memory Machine. According to Ellison, the M6-32 has twice the memory, can process data much faster and costs less than a third of IBM’s biggest comparable machine, making it ideal for in-memory databases.

Source

IBM Goes Linux

September 27, 2013
Filed under Computing


IBM reportedly will invest $1bn in Linux and other open source technologies for its Power system servers.

The firm is expected to announce the news at the Linuxcon 2013 conference in New Orleans, pledging to spend $1bn over five years on Linux and related open source technologies.

The software technology will be used on IBM’s Power line of servers, which are based on the chip technology of the same name and used for running large scale systems in data centres.

Previously IBM Power systems have mostly run IBM’s proprietary AIX version of Unix, though some used in high performance computing (HPC) configurations have run Linux.

If true, this would be the second time IBM has coughed up a $1bn investment in Linux. IBM gave the open source OS the same vote of confidence around 13 years ago.

According to the Wall Street Journal, IBM isn’t investing in Linux to convert its existing AIX customers, but instead Linux will help support data centre applications driving big data, cloud computing and analytics.

“We continue to take share in Unix, but it’s just not growing as fast as Linux,” said IBM VP of Power development Brad McCredie.

The $1bn is expected to go mainly for facilities and staffing to help Power system users move to Linux, with a new centre being opened in France especially to help manage that transition.

Full details are planned to be announced at Linuxcon later today.

Last month, IBM swallowed Israeli security firm Trusteer to boost its customers’ cyber defences with the company’s anti-hacking technology.

Announcing that it had signed a definitive agreement with Trusteer to create a security lab in Israel, IBM said it planned to focus on mobile and application security, counter-fraud and malware detection staffed by 200 Trusteer and IBM researchers.

Source

Oracle Issues Massive Security Update

July 29, 2013
Filed under Computing


Oracle has issued its critical patch update advisory for July, plugging a total of 89 security holes across its product portfolio.

The fixes focus mainly on remotely exploitable vulnerabilities in four widely used products, with 27 fixes issued for the Oracle Database, Fusion Middleware, the Oracle and Sun Systems Product Suite and the MySQL database.

Out of the 89 security fixes included with this update, the firm said six are for Oracle Database, with one of the vulnerabilities being remotely exploitable without authentication.

Oracle revealed that the highest CVSS Base Score for these database vulnerabilities is 9.0, a score related to vulnerability CVE-2013-3751, which affects the XML Parser on Oracle Database 11.2.0.2 and 11.2.0.3.

A further 21 patched vulnerabilities listed in Oracle’s Critical Patch Update are for Oracle Fusion Middleware; 16 of these vulnerabilities are remotely exploitable without authentication, with the highest CVSS Base Score being 7.5.

As for the Oracle and Sun Systems Products Suite, these products received a total of 16 security fixes, eight of which were also remotely exploitable without authentication, with a maximum CVSS Base Score of 7.8.

“As usual, Oracle recommends that customers apply this Critical Patch Update as soon as possible,” Oracle’s director of Oracle Software Security Assurance Eric Maurice wrote in a blog post.

Craig Young, a security researcher at Tripwire commented on the Oracle patch, saying the “drumbeat of critical patches” is more than alarming because the vulnerabilities are frequently reported by third parties who presumably do not have access to full source code.

“It’s also noteworthy that […] every Oracle CPU release this year has plugged dozens of vulnerabilities,” he added. “By my count, Oracle has already acknowledged and fixed 343 security issues in 2013. In case there was any doubt, this should be a big red flag to end users that Oracle’s security practices are simply not working.”

Source
