Hortonworks is going a step further in making Hadoop more reliable for enterprise adoption. Hortonworks Data Platform 2.4, which recently became available, is built to be updated on a regular cadence: Hortonworks has decided to update the core Hadoop components (HDFS, MapReduce, YARN and ZooKeeper) annually, while extensions like Spark, Hive, HBase and Ambari will be updated more frequently throughout the year.
Syncsort has made it easier for mainframe data to work in Hadoop and Spark by upgrading its DMX-h data integration software. Syncsort delivered this upgrade because companies in industries like financial services, banking, and insurance need to maintain their mainframe data in its native format.
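For context on what "native format" means here: mainframe records are typically EBCDIC-encoded rather than ASCII, which is one reason ordinary tooling cannot read them directly. Below is a minimal illustrative sketch using Python's built-in cp037 codec (an EBCDIC code page); the sample bytes are made up for the example, and this is not Syncsort's API.

```python
# Why "native format" matters: mainframe data is typically EBCDIC-encoded.
# cp037 is Python's built-in EBCDIC (US/Canada) codec; the bytes below are
# a made-up sample record, not real mainframe output.
record = b"\xc8\xc5\xd3\xd3\xd6"  # "HELLO" encoded in EBCDIC cp037

decoded = record.decode("cp037")
print(decoded)  # -> HELLO
```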
Cloudera, the global provider of the easiest and most secure data management platform built on Apache Hadoop, recently announced that it has moved from the Challengers to the Visionaries position in the 2016 Gartner Magic Quadrant for Data Warehouse and Data Management Solutions for Analytics.
Hadoop is emerging as the framework of choice for dealing with big data. It can no longer be classified as a specialized skill; it has to become the enterprise data hub of choice, working alongside the relational database, to deliver on its promise of being the go-to technology for Big Data Analytics. (Source: http://www.networksasia.net/article/hadoop-powering-next-generation-analytics.1457694542 )
Altiscale, a company that has always been at the forefront of making Hadoop adoption easier and reducing its complexity, recently launched a cloud service called Insight Cloud. This will make Hadoop easier to access for business users. Insight Cloud provides services for data ingestion, processing, analysis, and visualization.
Hadoop currently has over 100 open source projects running on its ecosystem, and it is getting difficult to choose which of them will work for your big data needs and which will not. To address this issue, James Casaletto, Solutions Architect at MapR, will speak on “Harnessing the Hadoop Ecosystem” at the 2016 Data Summit in NYC.
Commvault’s eleventh software release is all about enhancing its integrated solutions portfolio to better support Big Data initiatives. Commvault’s new technology will support big data environments like Hadoop, Greenplum and GPFS. It is a direct result of the need to enhance data storage, analysis and customer experience.
Badoo’s Data Architect, Demeter Sztanko, said that data growth at the firm was staggering and likely to continue at 5% every month; with more than 300 million users, that is not surprising. Sztanko announced at Computing’s 2016 Big Data & Analytics Summit that they are using a combination of Big Data tools to tackle the problem: Badoo uses Hadoop for batch processing and EXASOL’s analytics database.
Galactic Exchange is set to launch the beta version of its product, ClusterGX, open source software that gets a Hadoop cluster running in 5 minutes. Anyone can download ClusterGX, and it is designed to run on all major operating systems: Windows, Linux, and Mac OS.
At the conference, the big data world is eagerly waiting to discuss the top 7 disruptions coming to the market. Spark adoption is all the rage, and streaming and real-time data processing are the talk of the hour. Visualization and SQL-on-Hadoop are entering the mainstream, while Hadoop adoption and production deployments still rule the big data space.
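To ground the "streaming and real-time" theme, here is a minimal Spark Streaming word-count sketch in Python; the socket source on localhost:9999 and the 5-second batch interval are assumptions for illustration, not anything prescribed at the conference.

```python
# Minimal Spark Streaming sketch: count words arriving on a TCP socket
# in 5-second micro-batches. Source host/port are assumptions.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="streaming-sketch")
ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

lines = ssc.socketTextStream("localhost", 9999)
counts = (lines.flatMap(lambda line: line.split())
          .map(lambda word: (word, 1))
          .reduceByKey(lambda a, b: a + b))
counts.pprint()  # print each batch's counts to the driver console

ssc.start()
ssc.awaitTermination()
```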
Several big data tools and services are on display at the Strata + Hadoop World conference happening in San Jose. Many novel distributions of open source database technologies, along with handy tools to manage them like Looker Blocks and the Tamr data unification platform, are on show this week.
Four years ago, Centrica was struggling with how to deal with the exponential increase in big data. To tackle this, Centrica built a 250-node Hadoop data lake for less than 1 million pounds to merge data from structured and unstructured sources.
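As a rough sketch of the kind of structured-plus-unstructured merge a data lake like this enables, here is a minimal PySpark example; the HDFS paths, log layout, and the customer_id join key are all hypothetical, not details of Centrica's system.

```python
# Hedged PySpark sketch: land a structured CSV export and semi-structured
# text logs in one data lake, then join them. All paths/columns hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-lake-merge").getOrCreate()

# Structured source: customer records exported from a relational system.
customers = spark.read.csv("hdfs:///lake/raw/customers.csv",
                           header=True, inferSchema=True)

# Semi-structured source: raw text logs, parsed into (customer_id, event).
logs = (spark.read.text("hdfs:///lake/raw/meter_logs/")
        .select(F.split("value", ",").alias("parts"))
        .select(F.col("parts")[0].alias("customer_id"),
                F.col("parts")[1].alias("event")))

# Merge on a shared key and persist to the curated zone as Parquet.
merged = customers.join(logs, on="customer_id", how="left")
merged.write.mode("overwrite").parquet("hdfs:///lake/curated/customer_events")
```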
As Hadoop gains enterprise adoption, many organizations are yearning to power their BI operations to deliver meaningful analytics they can work with. However, with existing BI tools it is very difficult to run interactive queries on Hadoop; moreover, to connect BI tools with the data present in Hadoop, they need to use custom drivers. To solve this problem, AtScale has developed a new Hybrid Query service that does not require installing custom drivers but instead supports SQL and MDX natively.
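For a sense of what SQL-on-Hadoop connectivity looks like, here is a minimal sketch using the open source PyHive client against HiveServer2; this is not AtScale's Hybrid Query API, and the host, port, and table names are assumptions for the example.

```python
# Illustrative SQL-on-Hadoop query of the sort a BI dashboard generates,
# issued through the open source PyHive client (not AtScale's own API).
# Host, port, username, and the sales table are assumptions.
from pyhive import hive

conn = hive.connect(host="hadoop-edge-node", port=10000, username="analyst")
cursor = conn.cursor()

# A typical interactive aggregate over data stored in Hadoop.
cursor.execute("""
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    ORDER BY total_revenue DESC
""")
for region, total in cursor.fetchall():
    print(region, total)

cursor.close()
conn.close()
```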