Recap of Hadoop News for June 2018

News on Hadoop - June 2018

RightShip uses big data to find reliable vessels. HoustonChronicle.com, June 15, 2018.

RightShip is using IBM’s predictive big data analytics platform to calculate the likelihood that an individual merchant ship will experience compliance or mechanical trouble within the next year. It also leverages big data to analyse carbon emissions and vessel efficiency. RightShip removed more than 1,000 high-risk vessels from customer supply chains in 2017. It achieves this by rating ships based on a combination of big data and the knowledge of the superintendent. The rating system gives a one-star rating to ships that are likely to experience an incident in the next year and a five-star rating to ships that are least likely to do so. The company analyzes data from more than 50 sources, including vessel operators, trade associations, and government agencies. This data gives RightShip a complete vessel history covering casualties, inspections, incidents, and owners. The rating system can be customized for an individual company’s risk appetite.

(Source - https://www.houstonchronicle.com/business/article/RightShip-uses-big-data-to-find-reliable-vessels-12994358.php )

Hortonworks Data Platform turns 3.0; new cloud partnerships announced. Zdnet.com, June 18, 2018.

HDP hits a major milestone with version 3.0, a release based entirely on Apache Hadoop 3.1. Some key features included in HDP 3.0 are as follows -

  1. A headline feature that puts the elephant in a container: jobs dispatched to the YARN resource manager can now run inside entire Docker container images.
  2. HDP 3.0 also includes support for graphics processing units, allowing Hadoop jobs to execute AI and deep learning workloads.
  3. It also includes support for Hive 3.0, which features integration with Druid, a column-store data access and storage system for OLAP querying of time-series data. The integration is a win-win for both projects: Druid gets a SQL query abstraction, while Hive gets an interactive column-store BI engine.
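The Docker-on-YARN feature in the first point can be sketched with a job submission like the one below. This is only an illustration: the registry, image name, and example jar path are hypothetical placeholders, and it assumes a Hadoop 3.x cluster where the administrator has enabled the Docker container runtime. The `YARN_CONTAINER_RUNTIME_*` environment variables are the mechanism Hadoop 3.x uses to select a Docker image for task containers.

```shell
# Sketch: run the map and reduce tasks of a MapReduce job inside a Docker
# image on a Hadoop 3.x cluster with the Docker runtime enabled.
# "my-registry/hadoop-runtime:latest" and the jar path are placeholders.
DOCKER_IMAGE="my-registry/hadoop-runtime:latest"

hadoop jar hadoop-mapreduce-examples.jar pi \
  -Dmapreduce.map.env="YARN_CONTAINER_RUNTIME_TYPE=docker,YARN_CONTAINER_RUNTIME_DOCKER_IMAGE=${DOCKER_IMAGE}" \
  -Dmapreduce.reduce.env="YARN_CONTAINER_RUNTIME_TYPE=docker,YARN_CONTAINER_RUNTIME_DOCKER_IMAGE=${DOCKER_IMAGE}" \
  10 100
```

Because the image is specified per job, different workloads on the same cluster can ship their own dependencies without the administrator installing them on every node.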

Apart from the HDP 3.0 release, Hortonworks has added another feather to its cap with three cloud-focused partnership announcements - with IBM, Microsoft, and Google.

(Source - https://www.zdnet.com/article/hortonworks-data-platform-turns-3-0-new-cloud-partnerships-announced/ )

Mind the Gap: Coping with Big Data’s Big Costs. EnterpriseTech.com, June 18, 2018

According to a study by HBR, companies that leverage big data analytics earn, on average, profit gains and operational enhancements that are 5 to 6 percent higher than their competitors. The revenue generated by big data and business analytics is likely to cross the $200 billion mark by 2020. However, there is a huge gap (the big data activation gap) between having access to raw big data and the point at which that data can produce quantifiable value for the business. The first step towards bridging the big data activation gap is to identify the true value of big data for the business and when and how teams need access to it. Here are a few important points to consider to cope with big data’s big costs and help a company fully activate its big data -

  • Organizations need to pinpoint their destination and be specific about what they want in order to minimize costs. Whether the use case is ad hoc analysis or machine learning, the application areas will help organizations decide on the right big data activation strategy.
  • Having outlined the business requirements for big data, it is extremely important to decide on the right data tools to ensure wide access to data.
  • The right tools are essential, but so is a capable team that can make use of them. Organizations need to bridge the talent gap by making the most of the limited resources they have.

(Source - enterprisetech.com/2018/06/18/mind-the-gap-coping-with-big-datas-big-costs/ )

How the Indian Government is Using Big Data Analytics to Improve Economy and Public Policy. AnalyticsInsight.net, June 28, 2018

2017 will be remembered in India as the year to reckon with for beginning its big data journey. The Modi government reached out to many data mining experts to peek into the data and derive insights that could help the government act on relevant results. Economists in India have used data patterns to analyze GST so that they can understand how trade happens across the states. Big data analytics in India is today considered a cost-effective and easy way for the government to get work done. The introduction of GST and demonetization are outcomes of the data-driven policies which the Modi government has infused into the system. Many actions taken by the Indian government clearly reveal the use of big data analytics, and the recent budget announcements and keen interest reveal the government's adoption of AI and other big data policies. We can wait and watch how big data brings social and economic change in the upcoming years.

(Source - https://www.analyticsinsight.net/how-indian-government-is-using-big-data-analytics-to-improve-economy-and-public-policy/ )

Big data analytics: No big money needed as most solutions go 'freemium'. Indiatimes.com, June 29, 2018.

Cloud has enabled cash-constrained small and medium enterprises to make the best use of the latest technologies, like big data analytics, whose benefits could earlier be reaped only by large enterprises. SMEs do not have enough cash flow to make huge investments in technology, but their requirements and expectations are the same as those of large enterprises. Companies like IBM and Oracle are making it possible for SMEs to get the most out of technology by providing their offerings on the cloud, thus removing the biggest barrier between SMEs and large enterprises. Every Oracle product that was earlier available only as an on-premise solution is today available on the cloud. Oracle recently released an autonomous data warehouse, a cloud-based offering available to all clients, making it easier for SMEs to implement ideas faster with the limited resources they have. Most of these cloud-based tools are made accessible to companies on a freemium basis so that they can experiment with them as desired.

(Source - //economictimes.indiatimes.com/articleshow/64789553.cms?utm_source=contentofinterest&utm_medium=text&utm_campaign=cppst )

Future Demands of Hadoop and Big Data Analytics Market: Analysis, Growth, Application, Trends till 2023. thefreenewsman.com, June 29, 2018

The Hadoop and big data analytics market is anticipated to reach $40 billion by the end of 2022, with a compound annual growth rate of 43% during 2018-2022. A leading market research firm, QY Reports, has analyzed the Hadoop market hierarchy by performing SWOT analysis of the major players in the big data analytics market. The research report provides in-depth analysis of revenue, market share, and important market segments across various geographic regions, along with big data trends. You can get a detailed copy of the report from qyreports.com.

(Source - //thefreenewsman.com/future-demands-of-hadoop-and-big-data-analytics-market-analysis-growth-application-trends-till-2023/219803/ )
