News on Hadoop - January 2017
Big Data In Gambling: How A 360-Degree View Of Customers Helps Spot Gambling Addiction. Forbes.com, January 5, 2017.
The largest gaming agency in Finland, Veikkaus is using big data to build a 360-degree picture of its customers. Veikkaus merged with Fintoto (horse racing) and RAY (slots and casinos) in January 2017 to become the largest gaming organization in Europe. Veikkaus has developed a modern data architecture that pulls data from both digital and offline betting channels. The architecture is built on the open source platform Pentaho, which manages, prepares and integrates the data running through the company's environments, including the Cloudera Hadoop distribution, HP Vertica, Flume and Kafka.
(Source: http://www.forbes.com/sites/bernardmarr/2017/01/05/big-data-in-gambling-how-a-360-degree-view-of-customers-helps-spot-gambling-addiction/#511f592f3a29)
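The item above describes merging records from digital and offline betting channels into a single per-customer profile. A minimal sketch of that 360-degree merge in plain Python follows; the event schema (`customer_id`, `channel`, `stake`) is purely illustrative, not Veikkaus's actual data model:

```python
# Sketch: aggregate betting events from multiple channels into one
# profile per customer. Field names here are hypothetical.
from collections import defaultdict

digital_events = [
    {"customer_id": "C1", "channel": "digital", "stake": 20.0},
    {"customer_id": "C1", "channel": "digital", "stake": 35.0},
]
offline_events = [
    {"customer_id": "C1", "channel": "retail", "stake": 50.0},
    {"customer_id": "C2", "channel": "retail", "stake": 10.0},
]

def build_360_view(*event_streams):
    """Merge events from all channels into one profile per customer."""
    profiles = defaultdict(lambda: {"total_stake": 0.0, "channels": set()})
    for stream in event_streams:
        for event in stream:
            profile = profiles[event["customer_id"]]
            profile["total_stake"] += event["stake"]
            profile["channels"].add(event["channel"])
    return dict(profiles)

views = build_360_view(digital_events, offline_events)
```

A unified profile like `views["C1"]` (total stake across both channels, set of channels used) is the kind of signal a 360-degree view exposes, e.g. for spotting problem-gambling patterns invisible to any single channel.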
How Hadoop helps Experian crunch credit reports. CIO.com, January 5, 2017.
Experian has implemented a novel data analytics system that processes petabytes of data from hundreds of millions of customers worldwide in just a few hours instead of months. Experian deployed new software, a "data fabric" layer based on open source Hadoop, along with an API platform and microservices that will help consumers and corporate customers access credit reports and information quickly.
(Source: http://www.cio.com/article/3155048/analytics/how-hadoop-helps-experian-crunch-credit-reports.html)
5 Hadoop Trends to Watch in 2017. Datanami.com, January 6, 2017
Where is the powerful distributed computing platform headed in 2017? Datanami highlights the top 5 Hadoop trends to watch in 2017:
i) Though there are rumors of Hadoop’s demise, the numbers back the claim that Hadoop usage is expanding, not shrinking.
ii) An AtScale survey reveals that more than half of organizations with big data solutions run them in the cloud today, a share likely to grow to three-fourths. The future of Hadoop is cloudy.
iii) Machine learning automation is set to see a breakthrough in 2017.
iv) Companies building big data solutions on Hadoop will put data governance and security at the frontier of their big data initiatives in 2017.
v) In 2017, we might think of big data as a data fabric. The data fabric concept unites the data management, security, and self-service aspects of big data platforms.
(Source: https://www.datanami.com/2017/01/06/5-hadoop-trends-to-watch-in-2017/)
Database Ransom Attacks Hit CouchDB and Hadoop Servers. BleepingComputer.com, January 18, 2017.
Unknown groups of cybercriminals wiped data from Hadoop and CouchDB databases, demanding a ransom fee to return the stolen files and, in some cases, destroying the data just for fun. This attack took place soon after MongoDB databases were hijacked at the beginning of 2017 and their data held for ransom. After the attack on MongoDB servers, security experts predicted that other database servers would be hit as well.
(Source: https://www.bleepingcomputer.com/news/security/database-ransom-attacks-hit-couchdb-and-hadoop-servers/)
EIT Digital begins work on Hadoop open source product and start-up to take innovation to market. Vanillaplus.com, January 20, 2017.
The European open innovation organization EIT Digital has launched a new activity referred to as “HopsWork”, which will work on a next-generation Hadoop framework for processing large datasets. HopsWork will let organizations host several sensitive datasets on the same Hadoop cluster by providing dynamic role-based access control for HDFS and Kafka. The Hops platform will also provide best-in-class support for Spark Streaming and Flink.
(Source: http://www.vanillaplus.com/2017/01/20/24755-eit-digital-begins-work-hadoop-open-source-product-start-take-innovation-market/)
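The key idea in the item above is letting multiple sensitive datasets share one cluster through per-dataset, role-based access control. A toy sketch of such a check in Python follows; the roles, users and datasets are hypothetical illustrations, not the actual Hops API:

```python
# Sketch: role-based access control where a user's role (and hence
# permissions) can differ per dataset on a shared cluster.
# All names below are invented for illustration.
ROLE_PERMISSIONS = {
    "data_owner": {"read", "write", "share"},
    "data_scientist": {"read"},
    "guest": set(),
}

# Per-(user, dataset) role assignments; unknown pairs default to "guest".
dataset_roles = {
    ("alice", "fraud_logs"): "data_owner",
    ("bob", "fraud_logs"): "data_scientist",
    ("bob", "payments"): "guest",
}

def can_access(user, dataset, action):
    """Return True if the user's role on this dataset permits the action."""
    role = dataset_roles.get((user, dataset), "guest")
    return action in ROLE_PERMISSIONS[role]
```

The point of the per-dataset lookup is that `bob` can read one dataset while being locked out of another on the same cluster, which is what makes co-hosting sensitive datasets viable.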
Bringing Hadoop to the mainframe. Gigaom.com, January 23, 2017.
60% or more of enterprise transactions take place on mainframes built by IBM and competitors such as Unisys, Bull, Hitachi, and Fujitsu. However, with an increasing number of organizations adopting Hadoop in the enterprise, the time and cost incurred in extracting data from mainframe-based apps have become a major concern. IBM is bringing the mainframe to Hadoop in more effective ways, so that enterprises using mainframes benefit both from the cost-effectiveness and speed of a modern data analytics platform like Hadoop and from their existing investment in mainframe infrastructure.
(Source: https://gigaom.com/2017/01/23/report-bringing-hadoop-to-the-mainframe/)
Tax department leans on Big Data to mark out multiple PAN holders. EconomicTimes.com, January 29, 2017.
The Income Tax department will use big data analytics to plug tax loopholes. Using data on mobile numbers, common addresses and email IDs, the department will establish relationships between people holding multiple PANs so that it can easily track down evaders. The department will collaborate with various private firms to analyse the voluminous data collected after demonetisation and identify relationships between PAN holders. This information will then be integrated and matched against other IT databases, such as tax payments, TDS and third-party reporting, to build a detailed profile of each taxpayer.
(Source: http://economictimes.indiatimes.com/news/economy/policy/tax-department-leans-on-big-data-to-mark-out-multiple-pan-holders/articleshow/56850255.cms)
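Linking multiple PANs through shared identifiers can be sketched as a simple inverted index: group PANs by each identifying attribute and flag any attribute value claimed by more than one PAN. The sample records below are fabricated for illustration:

```python
# Sketch: flag PANs sharing a mobile number or email, suggesting one
# person may hold several PANs. Records are fabricated examples.
from collections import defaultdict

records = [
    {"pan": "PAN001", "mobile": "999", "email": "a@x.com"},
    {"pan": "PAN002", "mobile": "999", "email": "b@x.com"},
    {"pan": "PAN003", "mobile": "555", "email": "c@x.com"},
]

def link_pans(records, keys=("mobile", "email")):
    """Group PANs by each identifying attribute; keep shared values only."""
    groups = defaultdict(set)
    for rec in records:
        for key in keys:
            groups[(key, rec[key])].add(rec["pan"])
    # Attribute values claimed by more than one PAN are the suspects.
    return {k: pans for k, pans in groups.items() if len(pans) > 1}

suspects = link_pans(records)
```

At scale, the same grouping would be a distributed join or group-by over the department's databases rather than an in-memory dict, but the linkage logic is the same.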
Hadoop vendors make a jumble of security. InfoWorld.com, January 30, 2017.
The primary Hadoop vendors are getting serious about security, but the major concern with big data security management is the lack of standardization. Hadoop vendors are approaching security in different ways, which could turn the tiny toy elephant in the big data room into a potential vendor lock-in pitfall. The competing approaches to Hadoop security could result in poorer security in the short term, particularly for organizations that deploy more than one Hadoop stack. With huge investments from on-premise and cloud Hadoop vendors alike, Hadoop security is a prime concern that needs to be addressed and standardized.
(Source: http://www.infoworld.com/article/3162399/analytics/hadoop-vendors-make-a-jumble-of-security.html)