Dec 05, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Complex data  Source

[ AnalyticsWeek BYTES]

>> A Gentle Introduction to Linear Regression With Maximum Likelihood Estimation by administrator

>> Jun 28, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> Human-Centric Artificial Intelligence: What and Why? by analyticsweekpick

Wanna write? Click Here

[ FEATURED COURSE]

R, ggplot, and Simple Linear Regression

image

Begin to use R and ggplot while learning the basics of linear regression… more

[ FEATURED READ]

Rise of the Robots: Technology and the Threat of a Jobless Future

image

What are the jobs of the future? How many will there be? And who will have them? As technology continues to accelerate and machines begin taking care of themselves, fewer people will be necessary. Artificial intelligence… more

[ TIPS & TRICKS OF THE WEEK]

A strong business case could save your project
Like anything in corporate culture, the project is often about the business, not the technology. The same thinking applies to data analysis: it’s not always about the technicality but about the business implications. Data science project success criteria should include project management success criteria as well. This will ensure smooth adoption, easy buy-ins, room for wins, and cooperative stakeholders. So a good data scientist should also possess some qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q: What is the life cycle of a data science project?
A: 1. Data acquisition
Acquiring data from both internal and external sources, including social media or web scraping. In a steady state, data extraction routines should be in place, and new sources, once identified, are acquired following the established processes.

2. Data preparation
Also called data wrangling: cleaning the data and shaping it into a suitable form for later analyses. Involves exploratory data analysis and feature extraction.

3. Hypothesis & modelling
As in data mining, but applied to all the data rather than a sample. A key sub-step is model selection: preparing a training set for the candidate models, plus validation and test sets for comparing their performance, selecting the best-performing model, gauging its accuracy, and preventing overfitting (see the sketch at the end of this answer).

4. Evaluation & interpretation

Steps 2 to 4 are repeated as many times as needed; as the understanding of the data and the business becomes clearer and the results from initial models and hypotheses are evaluated, further tweaks are performed. These iterations may sometimes include step 5 and be performed in a pre-production environment.

5. Deployment

6. Operations
Regular maintenance and operations. Includes performance tests to measure model performance, which can alert the team when performance degrades beyond an acceptable threshold.

7. Optimization
Can be triggered by degrading performance, by the need to add new data sources and retrain the model, or by the deployment of a new version of an improved model.

Note: with increasing maturity and well-defined project goals, pre-defined performance criteria can help evaluate the feasibility of a data science project early in the life cycle. This early comparison helps the team refine the hypothesis, discard the project if it is not viable, or change approaches.
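To make the model-selection sub-step in step 3 concrete, here is a minimal sketch using scikit-learn on synthetic data. The candidate models, split ratios, and dataset are illustrative assumptions, not part of the original answer.

# Minimal sketch of model selection: hold out a test set, compare candidate
# models on a validation set, then gauge the winner's accuracy on the test set.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X_train, y_train, test_size=0.25, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# Fit each candidate on the training split and score it on the validation split.
val_scores = {name: model.fit(X_tr, y_tr).score(X_val, y_val)
              for name, model in candidates.items()}
best_name = max(val_scores, key=val_scores.get)

# Gauge the selected model's accuracy once on the held-out test set.
print(best_name, "test accuracy:", candidates[best_name].score(X_test, y_test))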


Source

[ VIDEO OF THE WEEK]

#FutureOfData Podcast: Peter Morgan, CEO, Deep Learning Partnership


Subscribe on YouTube

[ QUOTE OF THE WEEK]

We chose it because we deal with huge amounts of data. Besides, it sounds really cool. – Larry Page

[ PODCAST OF THE WEEK]

@JohnTLangton from @Wolters_Kluwer discussed his #AI Lead Startup Journey #FutureOfData #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Every second we create new data. For example, we perform 40,000 search queries every second (on Google alone), which works out to roughly 3.5 billion searches per day and 1.2 trillion searches per year. In August 2015, over 1 billion people used Facebook in a single day.

Sourced from: Analytics.CLUB #WEB Newsletter

Up Your Game With Interactive Data Visualizations

In an increasingly data-driven world, the way we visualize the massive amounts of information is a key concern. While static data visualizations have their uses, today’s data visualization tools offer a significant upgrade thanks to their customizability as well as their capacity to generate fully interactive dashboards.

Interactive data visualizations grant you several benefits, ranging from the purely aesthetic to the more practical. They make dashboards engaging, accessible, and turn massive data sets into easily interpretable visual tools. With such a powerful tool, it’s also important to understand the best ways to maximize its potential and derive actionable insights. Incorporating these tips and strategies can help you take your interactive visualizations to the next level.

Getting the most out of your interactive data visualization

An interactive data visualization is a tool used to visually express a set of data in an easy-to-interpret manner. Interactive data visualizations let users manipulate data sets to help them discover better insights. A well-designed interactive visualization adds significant value by displaying information in a new light while empowering you to explore data sets easily and effectively. Using some of these basic strategies can help you produce the best possible interactive data visualizations:

  • Consider the end-user for your visualization – Data visualizations in dashboards are tailored for specific audiences—managers, end-users, accountants, HR, and so on. Understanding their specific needs helps uncover the best possible visualization choices as well as the right data and way to express it.
  • Think about the story you’re telling – The data you apply will largely dictate the type of visualization you can use, and the story that data tells will determine the right interactive element for it.
  • Keep visualizations simple – Interactive visualizations work best when they are focused, concise, and eliminate unnecessary graphics, text, and other elements that take away from the data itself.
  • Choose visualizations that can be easily updated – Interactive visualizations in dashboards rarely display static data. An interactive data visualization that can’t easily be updated will be difficult to use more than once and will lose its value in a dashboard.

Dashboard Design

What kind of visualizations should you use?

There are several excellent examples of interactive visualizations that can quickly upgrade your dashboards and deliver better insights:

  • Sankey Diagrams – These visualizations are excellent for understanding and mapping the flow of data or objects. Sankey diagrams are most commonly used to measure web traffic, data flow between network nodes, or energy flow and consumption (see the sketch after this list).
  • Tree Ring Diagrams – These visualizations are ideal for illustrating and mapping out hierarchies between nodes and how data interacts in a network.
  • Collapsible Trees – These tools show decisions as branching paths from an initial point, allowing viewers to map out possible outcomes through several iterations.
  • Heat Maps – These types of maps are excellent when you have data based on geographical locations. For example, comparing per capita rates on a city map can be made dynamic by highlighting heat points on the map, or allowing for comparison between two locations.
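To make the Sankey example above concrete, here is a minimal sketch assuming the Plotly library is available; the page names and visitor counts are invented for illustration.

# Minimal Sankey sketch of visitor flow between pages (made-up labels and counts).
import plotly.graph_objects as go

pages = ["Home", "Pricing", "Docs", "Signup", "Exit"]
fig = go.Figure(go.Sankey(
    node=dict(label=pages, pad=20, thickness=15),
    link=dict(
        source=[0, 0, 1, 2, 1],       # indices into `pages` (flow origin)
        target=[1, 2, 3, 3, 4],       # indices into `pages` (flow destination)
        value=[120, 80, 45, 30, 75],  # visitors moving along each path
    ),
))
fig.show()

In a dashboard, the same structure could be driven by live page-transition counts instead of the hard-coded values.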

When should you use interactive data visualizations?

Interactive visualizations are not a silver bullet for dashboards, but there are several situations where they can add significant value to your data analysis.

One common use of interactive visualizations is understanding the flow of visitors through a website. By deploying an interactive data visualization, a company can track each individual’s journey through their website, including how long they spent on a page, when they left, and which pages they visited. Users can also view aggregate data to understand which pages are popular and which are losing the most viewers.

In IT, an interactive visualization could highlight different network configurations, as well as show chokepoints of data and areas where the architecture could be improved. Moreover, a tree ring diagram could visualize the relationship between different parts of the network.

For analysts, collapsible trees could show potential outcomes of different investment or risk decisions. Other interactive visualizations could expedite data comparison and create a more holistic view of different data sets to make them easier to explore. (Check out the interactive example below).


Investment Portfolio- Financial Dashboard


In financial dashboards, interactive visualizations could provide a simpler way to view investments in aggregate and then broken down into specific assets or different categories. Additionally, they could track investments over time and allow for higher specificity.

Creating dynamic dashboards

Interactive visualizations offer an easy way to fashion dashboards that are relevant, useful, and engaging. By adding a dynamic element to your data displays, you can add value to your data, keep users interested, and empower them to discover better insights. In a world that is constantly changing, using visualization tools that can reflect these shifts can help you stay ahead of the crowd.

Dashboard Design

Source: Up Your Game With Interactive Data Visualizations

Nov 28, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Weak data  Source

[ AnalyticsWeek BYTES]

>> Google Offers ‘Preemptible’ Virtual Machines by analyticsweekpick

>> Creating Real-Time Anomaly Detection Pipelines with AWS and Talend Data Streams by analyticsweekpick

>> Feb 22, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

Wanna write? Click Here

[ FEATURED COURSE]

Master Statistics with R

image

In this Specialization, you will learn to analyze and visualize data in R and create reproducible data analysis reports, demonstrate a conceptual understanding of the unified nature of statistical inference, perform fre… more

[ FEATURED READ]

Big Data: A Revolution That Will Transform How We Live, Work, and Think

image

“Illuminating and very timely . . . a fascinating — and sometimes alarming — survey of big data’s growing effect on just about everything: business, government, science and medicine, privacy, and even on the way we think… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly working its way into our core corporate DNA, yet we are not building practice grounds to test those models fast enough. Such snug-looking models can have hidden nails that cause uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q: How do you assess the statistical significance of an insight?
A: * Is this insight just observed by chance, or is it a real insight?
Statistical significance can be assessed using hypothesis testing:
– Stating a null hypothesis, which is usually the opposite of what we wish to test (classifiers A and B perform equivalently; treatment A is equal to treatment B)
– Then, we choose a suitable statistical test and test statistic to be used to reject the null hypothesis
– Also, we choose a critical region for the statistic to lie in that is extreme enough for the null hypothesis to be rejected (p-value)
– We calculate the observed test statistic from the data and check whether it lies in the critical region

Common tests (a short sketch using one of them follows this list):
– One sample Z test
– Two-sample Z test
– One sample t-test
– paired t-test
– Two sample pooled equal variances t-test
– Two sample unpooled unequal variances t-test and unequal sample sizes (Welch’s t-test)
– Chi-squared test for variances
– Chi-squared test for goodness of fit
– ANOVA (for instance: are two regression models equal? F-test)
– Regression F-test (i.e., is at least one of the predictors useful in predicting the response?)
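As a minimal sketch of the recipe above, here is one of the listed tests (Welch’s two-sample t-test) applied to made-up samples with SciPy; any of the other tests would follow the same state-test-decide pattern.

# Sketch of the recipe above: state H0 (the two samples have equal means),
# compute the test statistic, and compare the p-value with a chosen alpha.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a = rng.normal(loc=0.52, scale=0.05, size=200)  # e.g. accuracy of classifier A across runs
b = rng.normal(loc=0.50, scale=0.05, size=200)  # e.g. accuracy of classifier B across runs

t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)  # Welch's t-test
alpha = 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("reject H0" if p_value < alpha else "fail to reject H0")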

Source

[ VIDEO OF THE WEEK]

Ashok Srivastava(@aerotrekker @intuit) on Winning the Art of #DataScience #FutureOfData #Podcast


Subscribe on YouTube

[ QUOTE OF THE WEEK]

Data is the new science. Big Data holds the answers. – Pat Gelsinger

[ PODCAST OF THE WEEK]

@AmyGershkoff on building #winning #DataScience #team #FutureOfData #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Brands and organizations on Facebook receive 34,722 Likes every minute of the day.

Sourced from: Analytics.CLUB #WEB Newsletter

Talend Summer’18 Release: Under the Hood of Talend Cloud

Today on July 19, we released Talend Summer ’18, which is jam-packed with cloud features and capabilities. We know you are going to love the Talend Cloud automated integration pipelines, Okta Single Sign-On, and the enhanced data preparation and data stewardship functions…there is so much to explore!

Taking DevOps to the Next Level with the Launch of Jenkins Maven Plug-in Support

DevOps has become a widely adopted practice that streamlines and automates the processes between Development (Dev) and IT operations (Ops), so that they can design, build, test, and deliver software in a more agile, frictionless, and reliable fashion. However, the conventional challenge is that when it comes to DevOps, customers are not only tasked with finding the right people and culture, but also the right technology.

Data Integration fits into DevOps when it comes to building continuous data integration flows, as well as governing apps to support seamless data flows between apps and data stores. Selecting an integration tool that automates the process is critical. It will not only allow for more frequent deployment and testing of integration flows against different environments, increase code quality, and reduce downtime, but also free up the DevOps team’s time to work on new code.

Talend Cloud has transformed the way developers and ops teams collaborate to release software in the past few years. With the launch of Winter ’17, Talend Cloud accelerated the continuous delivery of integration projects by allowing teams to create, promote, and publish jobs in separate production environments. An increasing number of customers recognize the value that Talend Cloud brings for implementing DevOps practice. And now they can use the Talend Cloud Jenkins Maven plug-in in this Summer ’18 release, a feature that lets you automate and orchestrate the full integration process by building, testing, and pushing jobs to all Talend Cloud environments. This in turn further boosts the productivity of your DevOps team and reduces time-to-market.

Security and Compliance made Simple: Enterprise Identity and Access Management (IAM) with 1 Click

If you are an enterprise customer, you are likely faced with the growing demands of managing thousands of users and partners who need access to your cloud applications at any time and from any device. This adds to the complexity of Enterprise Identity and Access Management (IAM) requirements: meeting security and compliance regulations and audit policies, minimizing IT tickets, and giving only the right users access to the right apps. A Single Sign-On (SSO) feature helps address this challenge.

In the Summer ’18 release, Talend Cloud introduced Okta Single Sign-On (SSO) support. SSO permits a user to use one set of company login credentials to access multiple applications at once. This update ensures greater compliance with your company’s security and audit policies and improves user convenience. If you are with another identity management provider, you can simply download a plug-in to leverage this SSO feature.

The other security and compliance features worth mentioning in this release are Hadoop User Impersonation for Jobs in the cloud integration app and fine-grained permissions on the semantic type definitions; both provide greater data and user visibility for better compliance and auditing (see the release notes for details).

Better Data Governance at Your Fingertips: New Features in Talend Data Preparation and Data Stewardship Cloud Apps

The Summer ’18 release introduces several new data preparation and data stewardship functions. These include:

  • More data privacy and encryption functions with the new “hash data” function (a generic illustration of the idea follows this list).
  • Finer grained access control in the dictionary service for managing and accessing the semantic types.
  • Improved management in Data Stewardship, now that you can perform mass import, export and remove actions on your data models and campaigns, allowing you to promote, back up or reset your entire environment configuration in just two clicks.
  • Enhancements in the Salesforce.com connectivity that allow you to filter the data in the source module by defining a condition directly in your Salesforce dataset and focus on the data you need. This reduces the amount of data to be extracted and processed, making the use case of self-service cleansing and preparation of Salesforce.com data even more compelling.
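As a generic illustration of the idea behind a “hash data” function (this is not Talend’s implementation, just a sketch of the underlying technique), salted hashing for pseudonymizing a field might look like this in Python:

# Illustrative only: salted hashing of a field to pseudonymize it before
# sharing, the general idea behind a "hash data" preparation function.
import hashlib

def hash_value(value: str, salt: str = "demo-salt") -> str:
    """Return a salted SHA-256 digest of a single field value."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

emails = ["alice@example.com", "bob@example.com"]  # made-up records
print([hash_value(e) for e in emails])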

Those functionalities make cloud data governance a lot simpler and easier.

To learn more, please visit the Talend Cloud product page or sign up for a Talend Cloud 30-day free trial.

For more exciting updates, you can pre-register for Talend Connect 2019.

The post Talend Summer’18 Release: Under the Hood of Talend Cloud appeared first on Talend Real-Time Open Source Data Integration Software.

Source: Talend Summer’18 Release: Under the Hood of Talend Cloud by analyticsweekpick

Mitigating the Threat of Hackers to Your Supply Chain

The rate of digital disruption has recently skyrocketed across every industry, helping accelerate global expansion and automating mundane, menial tasks. Coupled with this is the fact that as businesses grow globally, we have seen an alarming uptick in cybercrime. Because organizations have become increasingly digitalized, they are opening themselves up to threatening landscapes where their […]

The post Mitigating the Threat of Hackers to Your Supply Chain appeared first on TechSpective.

Source: Mitigating the Threat of Hackers to Your Supply Chain by administrator

Nov 21, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data analyst  Source

[ AnalyticsWeek BYTES]

>> Getting to Love: Customer Word Clouds by bobehayes

>> Automating Invoice Processing with OCR and Deep Learning by administrator

>> Making Sense of BI Software Reviews by analyticsweek

Wanna write? Click Here

[ FEATURED COURSE]

The Analytics Edge

image

This is an Archived Course
EdX keeps courses open for enrollment after they end to allow learners to explore content and continue learning. Not all features and materials may be available, and course content will not be… more

[ FEATURED READ]

Data Science from Scratch: First Principles with Python

image

Data science libraries, frameworks, modules, and toolkits are great for doing data science, but they’re also a good way to dive into the discipline without actually understanding data science. In this book, you’ll learn … more

[ TIPS & TRICKS OF THE WEEK]

Finding success in your data science? Find a mentor
Yes, most of us don’t feel the need, but most of us really could use one. As most data science professionals work in isolation, getting an unbiased perspective is not easy. Many times, it is also not easy to see how your data science progression will unfold. A network of mentors addresses these issues easily; it gives data professionals an outside perspective and an unbiased ally. It’s extremely important for successful data science professionals to build a mentor network and use it throughout their careers.

[ DATA SCIENCE Q&A]

Q: What are the drawbacks of the linear model? Are you familiar with alternatives (Lasso, ridge regression)?
A: * Assumption of linearity of the errors
* Can’t be used for count outcomes or binary outcomes
* Can’t vary model flexibility: overfitting problems
* Alternatives: regularized linear models such as Lasso and ridge regression (see question 4 about regularization; a short sketch follows below)
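A minimal sketch of the regularized alternatives mentioned above, assuming scikit-learn and synthetic data; the alpha values are arbitrary illustrative choices.

# Ridge (L2 penalty) and Lasso (L1 penalty) shrink coefficients, which limits
# overfitting when a plain linear model is too flexible for the data at hand.
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=50, noise=10.0, random_state=0)

for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=0.1, max_iter=10000)):
    score = cross_val_score(model, X, y, cv=5).mean()  # mean R^2 across folds
    print(type(model).__name__, round(score, 3))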

Source

[ VIDEO OF THE WEEK]

Ashok Srivastava(@aerotrekker @intuit) on Winning the Art of #DataScience #FutureOfData #Podcast


Subscribe on YouTube

[ QUOTE OF THE WEEK]

I’m sure the highest-capacity storage device will not be enough to record all our stories; because every time with you is very valuable data.

[ PODCAST OF THE WEEK]

Understanding #FutureOfData in #Health & #Medicine - @thedataguru / @InovaHealth #FutureOfData #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Decoding the human genome originally took 10 years to process; now it can be achieved in one week.

Sourced from: Analytics.CLUB #WEB Newsletter

2018 Data and Visualization Gift Ideas

We’re continuing our tradition of the annual data gift guide. These are some of our favorite books and gift ideas for the data scientist, designer or analyst in your life.

While you’re here take a look at the Juicebox product page to see what it looks like unwrapped.

Happy Holidays!


New Books We Love

Books we read in 2018


Classic Data Books

We’re a little biased in this category, but these are the books on our desks that we refer to all the time.

Data Fluency – Thinking about changing how your team or organization works with data? This is the book for you.

Storytelling with Data – This one already feels like a classic. It provides simple, clear guidance on chart usage and storytelling. Hard not to reference it in the midst of a project.

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy – This is the book that keeps us grounded. Despite how much we think data is delicious and fun, it’s serious too.

The Man Who Lied to His Laptop: What We Can Learn About Ourselves from Our Machines – A seminal read on learning about interactions between humans and machines.

Visualize This: The FlowingData Guide to Design, Visualization, and Statistics – Nathan Yau’s book that teaches us something new every time we pick it up.

The Truthful Art: Data, Charts, and Maps for Communication – We love all of Alberto’s books, but this one is our favorite. Wonderful examples throughout the book.


Art & Posters

Infographics, Maps, Data Art & More


Data Nerds

This is a term of affection during the holidays.

Originally Posted at: 2018 Data and Visualization Gift Ideas

5 Steps to Transform HR with Predictive Talent Analytics

Every story has a beginning, and the story of Human Resources began in the 1950s (remember Personnel?). A lot has changed since then: the technology boom, four workforce generations, drastic changes in world economics and in the way we all work. Because HR is still so young in comparison to other mainstay professions like law or medicine, it is still facing its first round of major changes.

Big data, performance standards, talent analytics, talent management, performance and employee data: these have all had a major effect on how HR functions, since these data points didn’t even exist at the beginning of the HR story. While predictive analytics is new territory for many HR pros, there are some best practices around predictive analytics to help transform your HR department into a more agile and proactive organizational entity.

1. Assemble Your A-Team
First things first… You have to decide who is going to be on your team. Who has the skills and the knowledge needed? Which employees have the training to interpret the analytics? More and more companies have begun to hire specialized data scientists who have the skills and training to not only interpret the data but translate it as well. Travis Wright (@teedubya), a Chief Marketing Technologist, said:

“These specialists are a crucial part of ‘competitive intelligence,’ which is a new and quickly growing industry. The actual job description can vary from company to company, but the most common task is mining data (of course). ‘Big data’ was a big buzzword in 2014, but it will always remain a vital part of any company. Data is useless if it’s not ‘mined,’ which means optimally collected, analyzed, organized, and activated.”

Unfortunately, many companies cannot afford their own data scientist. Look for those who know their way in and around the HRIS or ATS and those with project management experience (since this will be a longer project than even self-professed “data geeks” will have the patience for).

2. Find Your Square One
Every stepping stone needs a benchmark. To move forward in the transformation, you have to understand your background. Likewise, talent analytics can’t become “predictive” without first assessing the current situation of your pipelines. Simply stated, HR leaders need to understand immediate needs to determine future talent needs. For the HR Professional, this means taking the time to make a map of the processes your department or team repeats over and over and then pinpointing efficiencies (or the lack thereof) in both process and tools.

3. Identify Valuable Numbers
There’s a lot of data out there for your business to look at. Just in an ATS alone, there are thousands of records, each with multiple data points. But you can’t analyze every piece of data that comes across the table or assign it the same level of importance. That’s why it is necessary to filter and funnel the kind of data you will evaluate. Key metrics should primarily involve internal data, but only the data that is truly relevant to your talent analytics. However, external data can be valuable when benchmarking within industries, creating compensation models and figuring supply and demand, so don’t throw it on the trash heap just yet. The primary issue for executives, however, is that only 27% feel they have the expertise necessary for talent analytics and only 13% have the systems to do so.

4. Sharing is Caring
Who needs to know the findings? Who needs to know the information you’ve just assembled? Stakeholders who have the power to make decisions based on this information rely on you to share the information so they can collaborate and make those decisions. That’s why these analytics need to be filtered, so they can be translated into actionable tasks. Beware sharing a messy pile of data points; instead, decide with your team what issues you need to identify and solve for X. Then deliver key insights that can translate into guiding parameters for your company.

5. Training, Training, Training
Big data has become prevalent in decision making. Talent analytics, as a facet of big data, is a predominant resource for HR professionals. They need to understand how data, statistics and analytics can benefit them in the hiring process and employee development processes. Offer your team training opportunities so they can develop their skills and become comfortable with that data. While there is a dearth of data scientists out there, analytics tend to be quite personal to the company, so train in-house and use your analytics vendors for additional learning.

Although HR is a rather young entity and still has a lot to learn, predictive talent analytics is the first major change specific to HR. This step creates an opportunity for organizations to become proactive versus reactive in their decision making. When you assemble the right personnel and define the starting point from which to benchmark, you can begin to share the data with stakeholders and train your HR professionals to analyze the data. These 5 steps can help company executives make the move toward data consumption as a way to influence their decisions based on a higher level of insight.

– See more at: http://blogs.infor.com/infor-hcm/2015/05/5-steps-to-transform-hr-with-predictive-talent-analytics.html#sthash.ymHbUAnw.dpuf

Originally Posted at: 5 Steps to Transform HR with Predictive Talent Analytics by analyticsweekpick

Nov 14, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data Storage  Source

[ AnalyticsWeek BYTES]

>> Three Types Of Context To Make Your Audience Care About Your Data by analyticsweek

>> Surge in real-time big data and IoT analytics is changing corporate thinking by analyticsweekpick

>> 7 Steps to Start Your Predictive Analytics Journey by analyticsweek

Wanna write? Click Here

[ FEATURED COURSE]

Intro to Machine Learning

image

Machine Learning is a first-class ticket to the most exciting careers in data analysis today. As data sources proliferate along with the computing power to process them, going straight to the data is one of the most stra… more

[ FEATURED READ]

Rise of the Robots: Technology and the Threat of a Jobless Future

image

What are the jobs of the future? How many will there be? And who will have them? As technology continues to accelerate and machines begin taking care of themselves, fewer people will be necessary. Artificial intelligence… more

[ TIPS & TRICKS OF THE WEEK]

A strong business case could save your project
Like anything in corporate culture, the project is often about the business, not the technology. The same thinking applies to data analysis: it’s not always about the technicality but about the business implications. Data science project success criteria should include project management success criteria as well. This will ensure smooth adoption, easy buy-ins, room for wins, and cooperative stakeholders. So a good data scientist should also possess some qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q: You are compiling a report for user content uploaded every month and notice a spike in uploads in October, in particular a spike in picture uploads. What might you think is the cause of this, and how would you test it?
A: * Halloween pictures?
* Look at uploads in countries that don’t observe Halloween as a sort of counter-factual analysis
* Compare the mean uploads in October with the mean uploads in September: hypothesis testing (a short sketch follows below)
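A minimal sketch of the suggested comparison, using made-up daily upload counts and SciPy; the Poisson rates are illustrative assumptions, not real data.

# Sketch of the suggested check with made-up daily picture-upload counts:
# test October against September, then repeat the comparison for countries
# that do not observe Halloween as the counter-factual.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
september = rng.poisson(lam=1000, size=30)  # daily uploads in September (synthetic)
october = rng.poisson(lam=1150, size=31)    # daily uploads in October (synthetic)

t_stat, p_value = stats.ttest_ind(october, september, equal_var=False)
print(f"October vs September: t = {t_stat:.2f}, p = {p_value:.4f}")
# If non-Halloween countries show no comparable jump, the Halloween
# hypothesis becomes much more plausible.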

Source

[ VIDEO OF THE WEEK]

#FutureOfData with Rob(@telerob) / @ConnellyAgency on running innovation in agency


Subscribe on YouTube

[ QUOTE OF THE WEEK]

With data collection, ‘the sooner the better’ is always the best answer. – Marissa Mayer

[ PODCAST OF THE WEEK]

Ashok Srivastava(@aerotrekker @intuit) on Winning the Art of #DataScience #FutureOfData #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

In late 2011, IDC Digital Universe published a report indicating that some 1.8 zettabytes of data would be created that year.

Sourced from: Analytics.CLUB #WEB Newsletter