Forrester estimates that organizations will spend $800 million on Hadoop and its related services in 2017. The major force exerting pressure on Hadoop is the cloud: with firms no longer interested in grappling with Hadoop's on-premise complexity, organizations are keen to shift their Hadoop stacks to the public cloud. This implies that Hadoop vendors' revenue will shift drastically from on-premise to the cloud. Just as Hadoop was not designed for the cloud, it is also not suited to the matrix math that deep learning requires. Even so, according to Forrester, Hadoop has a strong future and will witness strong growth over the next two to three years.
(Source : http://www.zdnet.com/article/the-cloud-is-disrupting-hadoop/)
At LinkedIn, distributed clusters run backend metrics, power experiments, and drive production data products used by millions of people. Multiple users interact with Hadoop through dozens of stacks every day, and numerous data flows move data between them. LinkedIn created Dr. Elephant to tackle the hassle of keeping a Hadoop cluster tidy. Dr. Elephant helps developers identify poorly performing jobs and find the root cause of the problem. It also gives Hadoop cluster users recommendations on how to improve the performance of a Hadoop job without affecting the overall efficiency of the cluster. Pepperdata, a commercial Hadoop performance vendor, announced the release of a new product based on Dr. Elephant known as Application Profiler. Application Profiler, along with Dr. Elephant, will help Hadoop shops speed up their DevOps cycles.
(Source : https://www.datanami.com/2017/03/07/dr-elephant-steps-cure-hadoop-cluster-pains/ )
Apache Hadoop has played a vital role for organizations over the last four to five years in handling growing volumes of data. Analysts forecast the global Hadoop market to grow at a compound annual growth rate of 59.37% during the period 2016-2020. Global Hadoop market reports highlight the key Hadoop vendors through 2020 as Cloudera, MapR, Hortonworks, AWS, Cisco, Datameer, IBM and Microsoft. The report also covers Hadoop market growth through 2020, including parameters such as challenges, trends and the various factors driving growth.
(Source : http://www.satprnews.com/2017/03/08/hadoop-market-2020-in-depth-research-on-dynamics-trends-outlook-emerging-growth-factors-and-forecasts-by-key-profiles-cloudera-hortonworks-mapr/ )
With Trump enforcing immigration laws, big data is set to play a vital role in the immigration policy agenda. Earlier this month, Palantir Technologies built an intelligence system known as Investigative Case Management (ICM) that should concern all US citizens who value civil liberties. ICM gives ICE agents access to large amounts of data to help immigration officials identify targets and then create and administer cases against them.
(Source : https://www.cato.org/blog/big-data-tools-trumps-big-government-immigration-plans )
Many enterprises announced the release of their new big data solutions at the Strata + Hadoop World conference held in San Jose this week.
i) MapR unveiled its new big data solution, MapR Edge, which will capture, process and analyse data from IoT devices and provide quick aggregation of insights.
ii) Cask announced the release of Cask Data Application Platform (CDAP) 4.1, featuring HEDIS healthcare reporting, EDW offload and an event-condition-action framework for IoT, along with resilience, manageability and enterprise-level security.
iii) Zaloni introduced Data Lake in a Box.
iv) Dell EMC announced a new Dell EMC Ready Bundle for Hortonworks Hadoop to help bridge the big data skills gap, helping organizations streamline the design, planning and configuration of Hadoop environments.
(Source: http://sdtimes.com/strata-hadoop-world-mapr-edge-zaloni-data-lake-box-dell-emc-ready-bundle-hortonworks-hadoop/#sthash.aabZV6sk.dpuf )
New test results, released at the Strata + Hadoop World conference using the industry-standard TPC-DS benchmark, reveal that Kognitio on Hadoop returned better results with greater overall consistency than big data SQL engines like Impala and Spark. Besides being more reliable, Kognitio on Hadoop also outperformed Apache Spark with results up to 178.5 times faster, and was up to 30.4 times faster than Cloudera Impala.
(Source : https://globenewswire.com/news-release/2017/03/14/936292/0/en/Kognitio-on-Hadoop-Outperforms-Spark-Impala-in-Industry-Standard-Benchmarking-Tests.html )
The University of North Texas has selected Attunity Replicate, from Attunity, a leading provider of big data management software solutions, to implement a Hadoop-based data lake as part of its real-time analytics initiative focused on its students. The solution will enhance agility by providing access to real-time student-domain analytics, helping the university increase enrollments and retention and enhance students' career prospects. The new Attunity solution will be demonstrated at the Strata + Hadoop World conference to be held in San Jose, CA.
(Source : http://www.prnewswire.com/news-releases/university-of-north-texas-selects-attunity-to-enable-hadoop-data-lake-for-strategic-analytics-initiative-616104413.html )
BlueData, a leading provider of a Big-Data-as-a-Service software platform, today announced its performance results. An Intel benchmarking study showed that it is possible to deliver the benefits of containerization for big data workloads without having to compromise on performance. By leveraging the power of Docker containers, the BlueData EPIC software platform makes it faster, easier and more cost-effective to deploy big data infrastructure and applications, including Kafka, Spark, Hadoop, Cassandra and others, whether in the public cloud, on-premise or within a hybrid architecture.
(Source : http://www.marketwired.com/press-release/bluedata-announces-bare-metal-performance-for-hadoop-on-docker-containers-2203091.htm)
SAP acquired Altiscale in 2016 to add scalability and performance to its BDaaS solution. Altiscale brought SAP a cloud infrastructure that provides both Hadoop and Spark as a service. This has reduced complexity and lets customers focus on extracting business value from these technologies instead of operating them, which is exactly what customers want from "Hadoop as a Service". SAP is now opening a data centre in Europe to meet the increasing demand for Hadoop as a service. The BDaaS offering includes Hive, Hadoop and Spark: it runs HDFS and YARN from Hadoop and integrates Hive and Spark as a service.
(Source : http://siliconangle.com/blog/2017/03/15/hadoop-as-a-service-allows-companies-to-focus-on-business-bigdatasv/ )
Large volumes of data are generated by electronic devices across end-use segments such as retail, banking, finance, insurance, healthcare and public utilities. Hadoop is considered an excellent alternative to manual effort for efficient data analysis as it expands its array of end-use industries. The Hadoop market is anticipated to grow at a compound annual growth rate of 26.3% from 2015 to 2023, reaching a valuation of USD 2,429 million by 2023. According to the Global Hadoop Market report by Transparency Market Research, BFSI will emerge as a key contributor to growth, with the Asia Pacific region leading the Hadoop market.
(Source : http://www.transparencymarketresearch.com/hadoop-market.html)
According to a recent IDC report, worldwide revenues for big data and analytics will reach $150.8 billion in 2017, a 12% increase over 2016. Banking, process manufacturing, discrete manufacturing, central government and professional services will see the largest investments in big data and analytics products in 2017. It is estimated that these five industries together will spend $72.4 billion on BDA solutions in 2017 and will remain the largest spenders through 2020, with investments totalling $101.5 billion.
(Source : https://www.information-management.com/news/big-data-and-analytics-see-double-digit-growth-through-2020)
Chronic myeloid leukaemia (CML) is incurable, and lifelong treatment requires patients to live with side effects and the risk of drug resistance. As the number of CML patients increases, the strain on health services grows. Bloodwise and the Scottish Cancer Foundation have created LEUKomics, which brings together CML gene expression data from specialised laboratories across the globe along with data from the University of Glasgow. The intention is to eliminate the bottlenecks around big data analysis in CML. Manual quality checks and computational processing are performed to extract gene expression information from each dataset. This will help clinicians and academics interpret data that was previously inaccessible to anyone without training in specialised computational approaches.
(Source : http://www.dddmag.com/article/2017/03/how-big-data-being-mobilized-fight-against-leukemia )
The global energy industry faces several challenges, such as balancing increasing demand among developing nations with the need for sustainability, and predicting how extreme weather conditions affect supply and demand. To address these challenges, GE Power, which supplies 30% of the world's electricity through its generators and turbines, is using big data and machine learning to build an "Internet of Power". This will help GE Power replace the linear, one-way traditional model of energy delivery.
(Source : https://www.forbes.com/sites/bernardmarr/2017/03/28/the-amazing-way-ge-is-combining-big-data-and-electrons-to-create-the-internet-of-energy/#c98268818062 )
According to the Telsyte Australian Big Data & Analytics Market Study 2017, 83% of CIOs plan to invest more in big data analytics in 2017. Among organizations using or planning to use big data, 30% cite predictive analytics as the goal. Big data processing is among the top five use cases for high performance computing and analytics. The use of big data and analytics is likely to be high across a range of applications, including financial modelling, retail sales, fraud detection, e-commerce, IoT and customer interaction.
(Source : https://channellife.com.au/story/australian-companies-spending-big-data-2017/ )