Big Data

How big data has changed politics

Data isn’t new, but we often forget that. In fact, even Sherlock Holmes recognized the power of data, as we can tell from one of his most famous quotes: “It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.” This is especially relevant in an age of fake news, one in which politicians are arguably being held more accountable than ever thanks to our newfound ability to process and understand data. Politicians are finding it more and more difficult to “twist facts to suit theories” when thousands of wannabe Holmeses are taking to Reddit to debunk them. Data is so important these days that it has overtaken oil as the world’s most valuable resource, which of course makes it a hot topic amongst politicians. Governmental organisations are learning to understand and to deal with data at regional, national and international levels, not because they want to but because they have to. Data is just that important.

Nasscom checks into Guiyang for analytics, big data projects

After striking a partnership with the Chinese city of Dalian – a technology hub – last year, IT industry body Nasscom is set to strike a second partnership with the city of Guiyang on Monday, focused on collaboration in Big Data and Analytics. As part of the partnership with the Guiyang Municipal government, agreements worth 25 million yuan between Chinese customers and Indian service providers are also going to be announced. The pilot projects, launched on the Sino Indian Digital Collaborative Opportunities Plaza (SIDCOP) platform, would be executed over the next year. In the coming months, an IT Corridor within Guiyang Hi-Tech city will be created to specifically promote Big Data and Analytics projects. The government of Guiyang is not only offering a host of policy benefits and incentives, such as rent-free office space and tax concessions, but will also aid Indian service providers – who are members of Nasscom – in bagging government contracts.

Why data science must stand at the forefront of customer acquisition

Though flashy advertising campaigns tend to get most of the attention, in reality, data science is what will ultimately have the biggest impact in ensuring effective customer acquisition. Just how valuable can big data be for a business? Some analysts believe that simply increasing data accessibility by 10 percent can help the average Fortune 1000 company generate an additional $65 million in income. Quality data can be even more valuable for new startups. When you have a limited marketing budget, you can’t afford to let your customer acquisition efforts go to waste — especially when in many industries, companies spend hundreds of dollars to acquire a single customer. Unlocking the potential of data science and analytics will enable you to gain greater insights into your customers, allowing you to spend your marketing budget more efficiently.
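
To make that budget argument concrete, here is a minimal TypeScript sketch of the kind of calculation involved: computing the cost per acquired customer for each marketing channel so spend can shift toward the cheapest ones. The channel names and figures below are hypothetical, purely for illustration.

```typescript
// Hypothetical per-channel marketing figures, purely for illustration.
interface ChannelStats {
  name: string;
  spend: number;              // total spend in dollars
  customersAcquired: number;  // customers won through this channel
}

const channels: ChannelStats[] = [
  { name: "paid-search", spend: 12000, customersAcquired: 60 },
  { name: "social", spend: 8000, customersAcquired: 20 },
  { name: "email", spend: 1500, customersAcquired: 25 },
];

// Customer acquisition cost (CAC) = spend / customers acquired.
const cac = (ch: ChannelStats): number => ch.spend / ch.customersAcquired;

// Rank channels from cheapest to most expensive per customer.
for (const ch of [...channels].sort((a, b) => cac(a) - cac(b))) {
  console.log(`${ch.name}: $${cac(ch).toFixed(2)} per customer`);
}
```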

17 best data science bootcamps for boosting your career

Data scientist is the best job in America, according to Glassdoor, and it consistently tops the list year after year. With a job score of 4.8 out of 5, a median base salary of $110,000 per year and over 4,500 current job openings, it’s a great time to be a data scientist. But as demand for the role grows, traditional schools aren’t churning out qualified candidates fast enough to fill the open positions. There’s also no clear path for those who have been in the tech industry for years and want to take advantage of a lucrative job opportunity. Enter the bootcamp, a format that has quickly grown in popularity as a way to train workers in in-demand tech skills. Here are 17 of the best data science bootcamps, designed to help you brush up on your data skills, with courses for everyone from beginners to experienced data scientists.

Reality check: the complexities of mapping the world in 2018

…as a critical component of purchasing precise and relevant third-party data. Across the thousands of data sets I’ve examined over my career, many have contained major gaps or inaccuracies that were not initially apparent on the surface. However, vetting these data sets is just one of the challenges organizations face today. What if there is no data to even buy? What if no credible information exists in the areas you need to know more about? With the exponential rise of physical, digital, mobile, and transactional data, many believe that complete, up-to-date, and reliable data – about anyone, anything, or any place – is readily available. Well, I’m here to tell them that they’re wrong. This information is simply not as obtainable as they think. Exploring the demand for data: When you look at the origin of the data being collected by today’s businesses, it is being generated by people, connected devices, and activities.
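
As a minimal illustration of that vetting step, the TypeScript sketch below scans a delimited file for two of the most common problems hiding in purchased data: blank fields and duplicate record IDs. The file name is hypothetical, and the naive comma split is only for illustration; a real CSV parser would be needed to handle quoted fields.

```typescript
import * as fs from "fs";

// Minimal vetting pass: count rows with blank fields and duplicate IDs.
// Note: a naive split(",") ignores quoted commas; use a CSV library in practice.
function vetDataset(path: string): void {
  const lines = fs.readFileSync(path, "utf8").trim().split(/\r?\n/);
  const seenIds = new Set<string>();
  let blankRows = 0;
  let duplicateIds = 0;

  for (const line of lines.slice(1)) { // skip header row
    const fields = line.split(",");
    if (fields.some((f) => f.trim() === "")) blankRows++;
    const id = fields[0]; // assumes the first column is a record ID
    if (seenIds.has(id)) duplicateIds++;
    seenIds.add(id);
  }

  console.log(`${lines.length - 1} records checked`);
  console.log(`${blankRows} rows with blank fields, ${duplicateIds} duplicate IDs`);
}

vetDataset("third_party_places.csv"); // hypothetical file name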

Shed light on your dark data before GDPR comes into force

Also referred to as unstructured data, dark data is growing at a rate of 62% per year, according to IDG. By 2022, they say, 93% of all data will be unstructured. Gartner defines dark data as “the information assets organizations collect, process and store during regular business activities, but generally fail to use for other purposes”. Dark data comes from a huge variety of sources – emails, documents, instant messages, digital media posts, partly developed applications – or is simply information which isn’t being used or analyzed; its very name makes it sound foreboding. With new regulations such as the GDPR coming into force, businesses must gain a clear understanding of the data they hold. For structured data, this is straightforward. But dark data is much harder to manage, stored across a distributed IT environment with no single owner. A ‘bottomless lake of data’: Dark data tends to be text-based data, as well as video, audio files and images.
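
A practical first step toward that understanding is simply taking stock of what is sitting on shared storage. The TypeScript sketch below walks a directory tree and tallies files into rough “unstructured data” buckets by extension. The share path and bucket definitions are assumptions for illustration; this is an inventory starting point, not a compliance tool.

```typescript
import * as fs from "fs";
import * as path from "path";

// Rough buckets for common unstructured formats (illustrative, not exhaustive).
const buckets: Record<string, string[]> = {
  documents: [".doc", ".docx", ".pdf", ".txt"],
  messages: [".msg", ".eml"],
  media: [".mp4", ".mov", ".mp3", ".wav", ".jpg", ".png"],
};

// Recursively tally files under `root` into the buckets above.
function inventory(root: string, counts = new Map<string, number>()): Map<string, number> {
  for (const entry of fs.readdirSync(root, { withFileTypes: true })) {
    const full = path.join(root, entry.name);
    if (entry.isDirectory()) {
      inventory(full, counts);
      continue;
    }
    const ext = path.extname(entry.name).toLowerCase();
    const bucket = Object.keys(buckets).find((b) => buckets[b].includes(ext)) ?? "other";
    counts.set(bucket, (counts.get(bucket) ?? 0) + 1);
  }
  return counts;
}

console.log(inventory("/srv/fileshare")); // hypothetical share path
```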

Tableau Enables Developers to Extend Functionality with New Extensions API

Tableau Software, Inc. (NYSE: DATA), a leading analytics platform, today announced the general availability of Tableau 2018.2. This release introduces a new Extensions API that enables customers to drag and drop third-party functionality directly into a dashboard. Publicly available extensions can be found in the new Extension Gallery at tableau.com/extensions. Additionally, Tableau customers can now more easily administer Tableau Server directly in the browser, thanks to the new Tableau Services Manager. Tableau 2018.2 also includes the ability to join data sources based on spatial data, and brand-new mobile dashboard formatting tools to improve mobile consumption of analytical content. Dashboard Extensions Give Customers New Functionality: The new Extensions API enables customers to drag and drop third-party functionality directly into Tableau.
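
For developers, a dashboard extension is essentially a web page, declared in a .trex manifest and loaded into a dashboard zone, that talks to Tableau through the Extensions API JavaScript library. The TypeScript sketch below shows the typical bootstrap based on the published API: initialize the extension, then enumerate the worksheets in the host dashboard. It is a minimal sketch, not code from the announcement.

```typescript
// Minimal dashboard-extension bootstrap. Assumes the Extensions API script
// is loaded on the page referenced by the extension's .trex manifest.
declare const tableau: any; // typings are also published as @tableau/extensions-api-types

async function start(): Promise<void> {
  // Handshake with the hosting dashboard; must complete before any other calls.
  await tableau.extensions.initializeAsync();

  const dashboard = tableau.extensions.dashboardContent.dashboard;
  console.log(`Extension running in dashboard: ${dashboard.name}`);

  // List the worksheets this extension can interact with.
  for (const worksheet of dashboard.worksheets) {
    console.log(` - ${worksheet.name}`);
  }
}

start().catch((err) => console.error("Extension failed to initialize:", err));
```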

Persistent Systems invests in big data firm Cazena

Persistent Systems has announced an undisclosed investment in Cazena, a big data-as-a-service firm. The investment was part of a $10mn round that included Andreessen Horowitz, Formation 8 and Northbridge Venture Partners, among others. As part of the investment round, Cazena has partnered with Persistent Systems to further extend the cloud Data Lake to include self-service data discovery and data services for enterprises that don’t have data engineering or machine learning (ML) skills. “Persistent consistently drives forward rapid digital transformation for our enterprise clients. Partnering with Cazena enables us to extend innovative solutions for analytics and ML for our customers. We are excited to announce the new Self-Service Data Lake, a combination of Cazena’s Big Data as a Service with Persistent’s ShareInsights analytic solutions and our ML/AI data services.”

Now, a big data boost for infrastructure sector

Researchers from the Indian Institutes of Technology (IIT) Madras and Bombay, along with Harvard University, are using big data to build India’s first comprehensive database of infrastructure projects, called the Integrated Database on Infrastructure Projects (IDIP). The mapping will span the entire infrastructure sector and all types of central, state and public-private partnership (PPP) projects. It will also cover all phases of a project – design and formulation, development, construction, and operation. “This is being envisaged in the Integrated Database on Infrastructure Projects in India, which is likely to be launched in a month’s time,” said Thillai Rajan A, a faculty member of the department of management studies at IIT Madras. The researchers are also collaborating with the National Highways Authority of India (NHAI), state governments and private-sector infrastructure developers.

Effective artificial intelligence requires a healthy diet of data

In the current technology landscape, nothing elicits quite as much curiosity and excitement as artificial intelligence (AI). And we are only beginning to see the potential benefits of AI applications within the enterprise. The growth of AI in the enterprise, however, has been hampered because data scientists too often have limited access to the relevant data they need to build effective AI models. These data specialists are frequently forced to rely solely on a few known sources, like existing data warehouses, rather than being able to tap into all the real-time, real-life data they need. In addition, many companies struggle to determine the business context and quality of massive amounts of data efficiently and affordably. Given these difficulties, it’s easy to understand some of the historical barriers to AI acceleration and adoption.