Saturday July 21, 2018

Twitter Moves its Core Infrastructure to Google Cloud



Twitter will move some of its core infrastructure to Google’s Cloud Platform for better data management, the company has announced.

“We are excited to announce that we are working with Google Cloud to move cold data storage and our flexible compute Hadoop clusters to Google Cloud Platform,” Parag Agrawal, Chief Technology Officer at Twitter, said in a blog post on Thursday.

“This will enable us to enhance the experience and productivity of our engineering teams working with our data platform,” he added.


Apache Hadoop is an open-source software framework for storing and processing Big Data.

The Hadoop compute system is the core of Twitter’s data platform, and the company runs multiple large Hadoop clusters that are among the biggest in the world.

“In fact, our Hadoop file systems host more than 300PB of data across tens of thousands of servers,” Agrawal said.

Google Cloud Platform’s data solutions and trusted infrastructure will provide Twitter with the technical flexibility and consistency that its platform requires.


The migration, when complete, will enable faster capacity provisioning, increased flexibility, access to a broader ecosystem of tools and services, and improvements to security.

“Architecturally, we will also be able to separate compute and storage for this class of Hadoop workloads, which has a number of long-term scaling and operational benefits,” the post read. (IANS)
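Separating compute from storage in practice means pointing Hadoop jobs at an object store rather than cluster-local HDFS. As an illustrative sketch only (the post does not describe Twitter's actual setup), a Hadoop cluster can address Google Cloud Storage through the open-source Cloud Storage connector, configured in core-site.xml roughly as follows; the project ID is a placeholder:

```xml
<!-- core-site.xml (illustrative sketch; values are placeholders) -->
<configuration>
  <property>
    <name>fs.gs.impl</name>
    <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
  </property>
  <property>
    <name>fs.AbstractFileSystem.gs.impl</name>
    <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS</value>
  </property>
  <property>
    <name>fs.gs.project.id</name>
    <value>example-project-id</value> <!-- placeholder -->
  </property>
</configuration>
```

With the connector in place, jobs can read cold data via gs:// paths while compute clusters are provisioned and resized independently of where the data lives, which is the scaling benefit the post alludes to.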

 

Copyright 2018 NewsGram


Tech Giants to join Data Transfer Project (DTP) To Help Users Manage Data



To help billions of users manage their data and transfer it into and out of online services without privacy issues, four tech giants — Facebook, Google, Microsoft and Twitter — on Friday announced that they are joining an open-source initiative called the Data Transfer Project (DTP).

Currently in its early stages, the Data Transfer Project will let users of one service take their data with them when they sign up for another service, with encryption.

“Using your data from one service when you sign up for another still isn’t as easy as it should be. Today we’re excited to announce that we’re participating in the Data Transfer Project,” said Steve Satterfield, Privacy and Public Policy Director at Facebook in a statement.

The initiative comes at a time when data-sharing is making headlines — be it the massive Cambridge Analytica data scandal or third-party apps accessing users’ data on various platforms — amid countries announcing new data-protection laws like the European General Data Protection Regulation (GDPR).

Moving data between any two services can be complicated because every service is built differently and uses different types of data that may require unique privacy controls and settings.

“For example, you might use an app where you share photos publicly, a social networking app where you share updates with friends, and a fitness app for tracking your workouts,” said Satterfield.


“These are the kinds of issues the Data Transfer Project will tackle. The Project is in its early stages, and we hope more organisations and experts will get involved,” he added.

The Data Transfer Project uses services’ existing APIs and authorisation mechanisms to access data. It then uses service specific adapters to transfer that data into a common format, and then back into the new service’s API.
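The adapter design described above can be sketched in a few lines of code. This is a minimal illustration of the pattern (export adapter, common intermediate format, import adapter), not the project's actual API; all class and field names here are hypothetical:

```python
from dataclasses import dataclass


@dataclass
class CommonPhoto:
    """Hypothetical common format for a photo record."""
    title: str
    url: str


class SourceServiceAdapter:
    """Exporter: converts a source-service record into the common format."""
    def export(self, record: dict) -> CommonPhoto:
        # The real project would call the source service's API here.
        return CommonPhoto(title=record["caption"], url=record["img_src"])


class DestServiceAdapter:
    """Importer: converts the common format into the destination's API payload."""
    def to_payload(self, photo: CommonPhoto) -> dict:
        return {"name": photo.title, "source_url": photo.url}


def transfer(record: dict) -> dict:
    photo = SourceServiceAdapter().export(record)   # service A -> common format
    return DestServiceAdapter().to_payload(photo)   # common format -> service B


print(transfer({"caption": "sunset", "img_src": "https://example.com/1.jpg"}))
```

Because each service only needs one adapter to and from the common format, adding an N-th service requires one new adapter pair rather than N-1 pairwise converters — which is what makes the design open to new participants.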

According to Google, the project will let users “transfer data directly from one service to another, without needing to download and re-upload it”.

The tech giants also released a white paper on this project.

“The future of portability will need to be more inclusive, flexible, and open. Our hope for this project is that it will enable a connection between any two public-facing product interfaces for importing and exporting data directly,” read the white paper.

According to Damien Kieran, Data Protection Officer at Twitter, many of the online products and services we use today do not interact with each other in a coherent and intuitive fashion.

“Information that is housed on one platform cannot be easily and securely transferred to other services. This is not a positive collective experience for the people who use our services and we are keen to work through some of the challenges as an industry,” he said.


The Data Transfer Project was formed in 2017 to create an open-source, service-to-service data portability platform so that all individuals across the web could easily move their data between online service providers whenever they want. (IANS)