Jul 27, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Data security (Source)

[ AnalyticsWeek BYTES]

>> May 1, 2017 Health and Biotech analytics news roundup by pstein

>> Navigating Big Data Careers with a Statistics PhD by anum

>> What Big Data Analytics Professionals Want From IT by analyticsweekpick

Wanna write? Click Here

[ NEWS BYTES]

>> Apple To Open Data Center In China With Government Ties – Manufacturing.net Under Data Center

>> Internet of things sensors could connect via ambient radio waves … – ScienceBlog.com (blog) Under Internet Of Things

>> Machine Learning Education: 3 Paths to Get Started – Datanami Under Machine Learning

More NEWS? Click Here

[ FEATURED COURSE]

Master Statistics with R


In this Specialization, you will learn to analyze and visualize data in R and create reproducible data analysis reports, demonstrate a conceptual understanding of the unified nature of statistical inference, perform fre… more

[ FEATURED READ]

Superintelligence: Paths, Dangers, Strategies


The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but … more

[ TIPS & TRICKS OF THE WEEK]

Data aids, not replaces, judgment
Data is a tool: a means to help build consensus and facilitate human decision-making, not to replace it. Analysis converts data into information; information, placed in context, leads to insight; insights lead to decisions, which ultimately lead to outcomes that bring value. So data is just the start; context and intuition still play a role.

[ DATA SCIENCE Q&A]

Q: Why is naive Bayes so bad? How would you improve a spam detection algorithm that uses naive Bayes?
A: Naive: the features are assumed independent/uncorrelated.
That assumption rarely holds in practice.
Improvement: decorrelate the features (e.g., whiten them so that their covariance matrix becomes the identity matrix).
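The decorrelation fix above can be sketched as a PCA "whitening" step in front of the classifier. A minimal illustration, assuming scikit-learn and made-up toy data (not part of the original Q&A):

```python
# Sketch: whiten features with PCA before Gaussian naive Bayes, so the
# independence assumption is closer to true. Data here is illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # nearly a copy of x1: strongly correlated
X = np.column_stack([x1, x2])
y = (x1 + x2 > 0).astype(int)

# Whitening rotates the features onto uncorrelated axes with unit variance,
# i.e. it turns the feature covariance matrix into (roughly) the identity.
model = make_pipeline(PCA(whiten=True), GaussianNB())
model.fit(X, y)

Xw = PCA(whiten=True).fit_transform(X)
print(np.round(np.cov(Xw, rowvar=False), 2))
```

After whitening, the empirical covariance of the transformed features is approximately the identity matrix, which is exactly the condition under which naive Bayes's independence assumption holds.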

Source

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Dr. Nipa Basu, @DnBUS


Subscribe to YouTube

[ QUOTE OF THE WEEK]

If we have data, let’s look at data. If all we have are opinions, let’s go with mine. – Jim Barksdale

[ PODCAST OF THE WEEK]

#FutureOfData Podcast: Conversation With Sean Naismith, Enova Decisions


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Brands and organizations on Facebook receive 34,722 Likes every minute of the day.

Sourced from: Analytics.CLUB #WEB Newsletter

7 Things to Look before Picking Your Data Discovery Vendor


Data discovery tools, also called data visualization or data analytics tools, are the talk of the town right now. With all the hype around big data, companies are busy planning their big-data strategies and are on the lookout for great tools to recruit into them. One thing to note is that we don’t change data discovery vendors every day: once a tool gets into our systems, it eventually becomes part of our big-data DNA. So we should put real thought into what goes into picking a data discovery/visualization/analysis tool.

So, what should you consider when picking your data discovery tool vendor? I interviewed data scientists and data managers at a number of companies and distilled their answers into the top 7 things to consider before you go out and pick a big-data discovery tool vendor.

Here are my 7 thoughts, in no particular order:

1. Not a major jump from what I already have: Yes, learning a new system takes time, effort, resources and cycles, so the faster the ramp-up and the shorter the learning curve, the better. Sure, some tools will be eons apart from what you are used to, and that should not deter you from evaluating them, but lean toward tools with a minimal learning curve. One thing to check is that you can do routine tasks with the new tool in much the same way you did before.

2. Helps me do more with my data: There will be moments when you realize a tool can do far more than what you are used to, or currently capable of. That is another check to include in your evaluation: the richer the feature set and capabilities of the discovery tool, the more it will let you do with your data. You should be able to investigate your data more closely and along more dimensions, which ultimately leads to a better understanding of it.

3. Integrates well with my big data: First things first, you need a tool that can at least talk to your data. It should mingle well with your data layouts and structures without requiring too many time-consuming steps. A good tool makes data integration almost seamless; if you have to jump through hoops to make integration happen, you are probably looking at the wrong tool. So get your data integration team to work and make sure integration is a non-issue for any tool you are evaluating.

4. Friendly with outside data I might include as well: Often it is not only about your data. Sometimes you need to access external data and relate it to your own, so those use cases must be checked as well: how easy is it to include external structured and unstructured data? The bigger the vendor’s product integration roadmap, the easier it will be for the tool to connect with external resources. Your preferred tool should integrate seamlessly with the data sets common in your industry; social data, industry data and third-party application data are some examples. So ask your vendor how well their tool mingles with outside data sets.

5. Scalability of the platform: Sure, the tool you are evaluating does wonders with data today and has a sweet feature set, but will it scale as you grow? This is an important consideration for any good corporate tool. If your business grows, so will its data and the associated dependencies; will your discovery tool grow with it? This finding must be part of your evaluation score for any tool you plan to recruit for your big-data discovery needs. So get on a call with the vendor’s technical team and grill them on how the tool handles growing data. You don’t want to partner with a tool that breaks as your business grows.

6. Vendor’s vision is in line with ours: The five measures above are fairly standard and define the basic functionality a good tool should offer, and it is no surprise that most tools cover some interpretation of them. The key strategic question is the vendor’s vision for the company and the product. A tool can do you good today, with a boatload of features and friendliness to both your data and outside data, but will it evolve along a strategy consistent with yours? However odd it sounds, this is a reality you should consider. A vendor handling only healthcare will affect companies using the tool in the insurance sector; a tool that handles only clever visualization may disappoint companies expecting some automation as part of the core tool’s evolution. So understand the product vision of the tool company; it will tell you whether the tool will still deliver business value tomorrow, the day after, and in the foreseeable future.

7. Awesome import/export tools to keep my data and analysis free: Another important thing to note is product stickiness. A good product keeps customers by banking on its features, usability and data-driven design, not by holding their data hostage. Data, and the knowledge derived from it, should be easily importable and exportable in common standards (CSV, XML, etc.). This keeps the tool open to integration with third-party services that emerge as the market evolves, and it will play an instrumental role in moving your data around as you start dealing with new formats and new reporting tools that leverage your data discovery findings.

I am certain that by the end of these 7 points you have thought of several more criteria to keep in mind before picking a good data discovery tool. Feel free to email me your findings and I will keep adding to the list.

Source

April 10, 2017 Health and Biotech analytics news roundup

A DNA-testing company is offering patients a chance to help find a cure for their conditions: Invitae is launching the Patient Insights Network, where people can input their own genome data and help link it to other health data.

Congratulations, you’re a parasite!  Erick Turner and Kun-Hsing Yu won the first ‘Research Parasite’ award, given to highlight reanalysis of data. The name is a tongue-in-cheek reference to an infamous article decrying the practice.

IMI chief: ‘We need to learn how to share data in a safe and ethical manner’: Pierre Meulien discusses the EU’s Innovative Medicines Initiative, where public and private institutions collaborate.

5 Tips for Making Use of Big Data in Healthcare Production: Two pharmaceutical executives offer their opinions on using data in pharmaceutical manufacturing.

Originally Posted at: April 10, 2017 Health and Biotech analytics news roundup by pstein

Jul 20, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Pacman (Source)

[ AnalyticsWeek BYTES]

>> See what you never expected with data visualization by analyticsweekpick

>> With big data invading campus, universities risk unfairly profiling their students by anum

>> Health analytics: The treatment of choice for efficient, cost-effective care by anum

Wanna write? Click Here

[ NEWS BYTES]

>> Big Data in Business Analytics – The CPA Journal Under Business Analytics

>> Girl Scouts to Earn Cyber Security Badges – My Champlain Valley FOX44 & ABC22 Under Cyber Security

>> Second Denver metro car wash targeted for data security breach – The Denver Post Under Data Security

More NEWS? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning


This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

Thinking, Fast and Slow


Drawing on decades of research in psychology that resulted in a Nobel Prize in Economic Sciences, Daniel Kahneman takes readers on an exploration of what influences thought example by example, sometimes with unlikely wor… more

[ TIPS & TRICKS OF THE WEEK]

Grow at the speed of collaboration
Research by Cornerstone OnDemand pointed out the need for better collaboration within the workforce, and the data analytics domain is no different. A rapidly changing and growing industry like data analytics is very difficult for an isolated workforce to keep up with. A good collaborative work environment facilitates a better flow of ideas, improved team dynamics, rapid learning, and an increased ability to cut through the noise. So, embrace collaborative team dynamics.

[ DATA SCIENCE Q&A]

Q: Do you think 50 small decision trees are better than a large one? Why?
A: * Yes!
* More robust model (an ensemble of weak learners combines into a strong learner)
* Better to improve a model by taking many small steps than a few large ones
* If one tree errs, the others can correct it
* Less prone to overfitting
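The claims above can be sanity-checked empirically. A minimal sketch, assuming scikit-learn and a synthetic, deliberately noisy dataset (both are illustrative assumptions, not from the newsletter), comparing one fully grown tree to 50 shallow ones:

```python
# Compare a single deep decision tree with an ensemble of 50 small trees
# (a random forest) on held-out data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)   # ~10% label noise
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One large tree, grown until the leaves are pure: prone to fitting the noise.
big_tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# 50 small (depth-limited) trees, each trained on a bootstrap sample.
forest = RandomForestClassifier(n_estimators=50, max_depth=5,
                                random_state=0).fit(X_tr, y_tr)

print("single deep tree:", big_tree.score(X_te, y_te))
print("50 small trees  :", forest.score(X_te, y_te))
```

On noisy data like this, the ensemble of small trees typically generalizes better than the single deep tree, for exactly the overfitting reasons listed above.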

Source

[ VIDEO OF THE WEEK]

#GlobalBusiness at the speed of The #BigAnalytics


Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data beats emotions. – Sean Rad, founder of Ad.ly

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @ScottZoldi, @FICO


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

According to estimates, the volume of business data worldwide, across all companies, doubles every 1.2 years.
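For intuition, a doubling time of 1.2 years corresponds to an annual growth rate of 2^(1/1.2) − 1, roughly 78% per year. A one-line check (the arithmetic is ours; the underlying estimate is the newsletter's):

```python
# Convert the quoted doubling time (1.2 years) into an implied annual
# growth rate: if a quantity doubles every T years, growth = 2**(1/T) - 1.
doubling_time_years = 1.2
annual_growth = 2 ** (1 / doubling_time_years) - 1
print(f"implied growth: ~{annual_growth:.0%} per year")
```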

Sourced from: Analytics.CLUB #WEB Newsletter

The 37 best tools for data visualization

Creating charts and infographics can be time-consuming, but these tools make it easier.

It’s often said that data is the new world currency, and the web is the exchange bureau through which it’s traded. As consumers, we’re positively swimming in data; it’s everywhere, from labels on food packaging to World Health Organisation reports. As a result, it’s becoming increasingly difficult for the designer to present data in a way that stands out from the mass of competing data streams.

One of the best ways to get your message across is to use a visualization to quickly draw attention to the key messages, and by presenting data visually it’s also possible to uncover surprising patterns and observations that wouldn’t be apparent from looking at stats alone.

EVENT PROMOTION

Not a web designer or developer? You may prefer free tools for creating infographics.

As author, data journalist and information designer David McCandless said in his TED talk: “By visualizing information, we turn it into a landscape that you can explore with your eyes, a sort of information map. And when you’re lost in information, an information map is kind of useful.”

There are many different ways of telling a story, but everything starts with an idea. So to help you get started we’ve rounded up some of the most awesome data visualization tools available on the web.

01. Dygraphs

Help visitors explore dense data sets with JavaScript library Dygraphs

Dygraphs is a fast, flexible open source JavaScript charting library that allows users to explore and interpret dense data sets. It’s highly customizable, works in all major browsers, and you can even pinch to zoom on mobile and tablet devices.

02. ZingChart

ZingChart lets you create HTML5 Canvas charts and more

ZingChart is a JavaScript charting library and feature-rich API set that lets you build interactive Flash or HTML5 charts. It offers over 100 chart types to fit your data.

03. InstantAtlas

InstantAtlas enables you to create highly engaging visualisations around map data

If you’re looking for a data viz tool with mapping, InstantAtlas is worth checking out. This tool enables you to create highly-interactive dynamic and profile reports that combine statistics and map data to create engaging data visualizations.

04. Timeline

Timeline creates beautiful interactive visualizations

Timeline is a fantastic widget which renders a beautiful interactive timeline that responds to the user’s mouse, making it easy to create advanced timelines that convey a lot of information in a compressed space.

Each element can be clicked to reveal more in-depth information, making this a great way to give a big-picture view while still providing full detail.

05. Exhibit

Exhibit makes data visualization a doddle

Developed by MIT, and fully open-source, Exhibit makes it easy to create interactive maps, and other data-based visualizations that are orientated towards teaching or static/historical based data sets, such as flags pinned to countries, or birth-places of famous people.

06. Modest Maps

Integrate and develop interactive maps within your site with this cool tool

Modest Maps is a lightweight, simple mapping tool for web designers that makes it easy to integrate and develop interactive maps within your site, using them as a data visualization tool.

The API is easy to get to grips with, and offers a useful number of hooks for adding your own interaction code, making it a good choice for designers looking to fully customise their user’s experience to match their website or web app. The basic library can also be extended with additional plugins, adding to its core functionality and offering some very useful data integration options.

07. Leaflet

Use OpenStreetMap data and integrate data visualisation in an HTML5/CSS3 wrapper

Another mapping tool, Leaflet makes it easy to use OpenStreetMap data and integrate fully interactive data visualisation in an HTML5/CSS3 wrapper.

The core library itself is very small, but there are a wide range of plugins available that extend the functionality with specialist functionality such as animated markers, masks and heatmaps. Perfect for any project where you need to show data overlaid on a geographical projection (including unusual projections!).

08. WolframAlpha

Wolfram Alpha is excellent at creating charts

Billed as a “computational knowledge engine”, the Google rival WolframAlpha is really good at intelligently displaying charts in response to data queries without the need for any configuration. If you’re using publicly available data, it offers a simple widget builder to make it really simple to get visualizations on your site.

09. Visual.ly

Visual.ly makes data visualization as simple as it can be

Visual.ly is a combined gallery and infographic generation tool. It offers a simple toolset for building stunning data representations, as well as a platform to share your creations. This goes beyond pure data visualisation, but if you want to create something that stands on its own, it’s a fantastic resource and an info-junkie’s dream come true!

10. Visualize Free

Make visualizations for free!

Visualize Free is a hosted tool that allows you to use publicly available datasets, or upload your own, and build interactive visualizations to illustrate the data. The visualizations go well beyond simple charts, and the service is completely free. While development work requires Flash, output can be done through HTML5.

11. Better World Flux

Making the ugly beautiful – that’s Better World Flux

Orientated towards making positive change to the world, Better World Flux has some lovely visualizations of some pretty depressing data. It would be very useful, for example, if you were writing an article about world poverty, child undernourishment or access to clean water. This tool doesn’t allow you to upload your own data, but does offer a rich interactive output.

12. FusionCharts

A comprehensive JavaScript/HTML5 charting solution for your data visualization needs

FusionCharts Suite XT brings you 90+ charts and gauges, 965 data-driven maps, and ready-made business dashboards and demos. FusionCharts comes with an extensive JavaScript API that makes it easy to integrate with any AJAX application or JavaScript framework. These charts, maps and dashboards are highly interactive, customizable and work across all devices and platforms. The vendor also publishes a comparison of the top JavaScript charting libraries, which is worth checking out.

13. jqPlot

jqPlot is a nice solution for line and point charts

Another jQuery plugin, jqPlot is a nice solution for line and point charts. It comes with a few nice additional features such as the ability to generate trend lines automatically, and interactive points that can be adjusted by the website visitor, updating the dataset accordingly.

14. Dipity

Dipity has free and premium versions to suit your needs

Dipity allows you to create rich interactive timelines and embed them on your website. It offers a free version and a premium product, with the usual restrictions and limitations present. The timelines it outputs are beautiful and fully customisable, and are very easy to embed directly into your page.

15. Many Eyes

Many Eyes was developed by IBM

Developed by IBM, Many Eyes allows you to quickly build visualizations from publicly available or uploaded data sets, and features a wide range of analysis types, including the ability to scan text for keyword density and saturation. This is another great example of a big company supporting research and sharing the results openly.

16. D3.js

You can render some amazing diagrams with D3

D3.js is a JavaScript library that uses HTML, SVG, and CSS to render some amazing diagrams and charts from a variety of data sources. This library, more than most, is capable of some seriously advanced visualizations with complex data sets. It’s open source, and it uses web standards, so it’s very accessible. It also includes some fantastic user interaction support.

17. JavaScript InfoVis Toolkit

JavaScript InfoVis Toolkit includes a handy modular structure

A fantastic library written by Nicolas Belmonte, the JavaScript InfoVis Toolkit includes a modular structure, so visitors only download what’s absolutely necessary to display your chosen data visualizations. This library has a number of unique styles and swish animation effects, and is free to use (although donations are encouraged).

18. jpGraph

jpGraph is a PHP-based data visualization tool

If you need to generate charts and graphs server-side, jpGraph offers a PHP-based solution with a wide range of chart types. It’s free for non-commercial use, and features extensive documentation. By rendering on the server, this is guaranteed to provide a consistent visual output, albeit at the expense of interactivity and accessibility.

19. Highcharts

Highcharts has a huge range of options available

Highcharts is a JavaScript charting library with a huge range of chart options available. The output is rendered using SVG in modern browsers and VML in Internet Explorer. The charts are beautifully animated into view automatically, and the framework also supports live data streams. It’s free to download and use non-commercially (and licensable for commercial use). You can also play with the extensive demos using JSFiddle.

20. Google Charts

Google Charts has an excellent selection of tools available

The seminal charting solution for much of the web, Google Charts is highly flexible and has an excellent set of developer tools behind it. It’s an especially useful tool for specialist visualizations such as geocharts and gauges, and it also includes built-in animation and user interaction controls.

21. Excel

It isn’t graphically flexible, but Excel is a good way to explore data: for example, by creating ‘heat maps’ like this one

You can actually do some pretty complex things with Excel, from ‘heat maps’ of cells to scatter plots. As an entry-level tool, it can be a good way of quickly exploring data, or creating visualizations for internal use, but the limited default set of colours, lines and styles makes it difficult to create graphics that would be usable in a professional publication or website. Nevertheless, as a means of rapidly communicating ideas, Excel should be part of your toolbox.

Excel comes as part of the commercial Microsoft Office suite, so if you don’t have access to it, Google’s spreadsheets – part of Google Docs and Google Drive – can do many of the same things. Google ‘eats its own dog food’, so the spreadsheet can generate the same charts as the Google Chart API. This will get you familiar with what is possible before stepping off and using the API directly for your own projects.

22. CSV/JSON

CSV (Comma-Separated Values) and JSON (JavaScript Object Notation) aren’t actual visualization tools, but they are common formats for data. You’ll need to understand their structures and how to get data in or out of them.
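A minimal round trip between the two formats, using only the Python standard library (the field names are made up for illustration):

```python
# CSV -> Python dicts -> JSON -> back to CSV, with the stdlib only.
import csv
import io
import json

csv_text = "name,value\nalpha,1\nbeta,2\n"

rows = list(csv.DictReader(io.StringIO(csv_text)))  # parse CSV into dicts
payload = json.dumps(rows)                          # serialize to JSON text
back = json.loads(payload)                          # parse JSON back

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["name", "value"])
writer.writeheader()
writer.writerows(back)                              # write rows back as CSV
print(out.getvalue())
```

Note that CSV is untyped: the numeric-looking values come back as strings, which is one of the structural differences between the two formats worth understanding before wiring them into a visualization tool.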

23. Crossfilter

Crossfilter in action: by restricting the input range on any one chart, data is affected everywhere. This is a great tool for dashboards or other interactive tools with large volumes of data behind them

As we build more complex tools to enable clients to wade through their data, we are starting to create graphs and charts that double as interactive GUI widgets. JavaScript library Crossfilter can be both of these. It displays data, but at the same time, you can restrict the range of that data and see other linked charts react.

24. Tangle

Tangle creates complex interactive graphics. Pulling on any one of the knobs affects data throughout all of the linked charts. This creates a real-time feedback loop, enabling you to understand complex equations in a more intuitive way

The line between content and control blurs even further with Tangle. When you are trying to describe a complex interaction or equation, letting the reader tweak the input values and see the outcome for themselves provides both a sense of control and a powerful way to explore data. JavaScript library Tangle is a set of tools to do just this.

Dragging on variables enables you to increase or decrease their values and see an accompanying chart update automatically. The results are only just short of magical.

25. Polymaps

Aimed more at specialist data visualisers, the Polymaps library creates image and vector-tiled maps using SVG

Polymaps is a mapping library that is aimed squarely at a data visualization audience. Offering a unique approach to styling the maps it creates, analogous to CSS selectors, it’s a great resource to know about.

26. OpenLayers

It isn’t easy to master, but OpenLayers is arguably the most complete, robust mapping solution discussed here

OpenLayers is probably the most robust of these mapping libraries. The documentation isn’t great and the learning curve is steep, but for certain tasks nothing else can compete. When you need a very specific tool no other library provides, OpenLayers is always there.

27. Kartograph

Kartograph’s projections breathe new life into our standard slippy maps

Kartograph’s tag line is ‘rethink mapping’ and that is exactly what its developers are doing. We’re all used to the Mercator projection, but Kartograph brings far more choices to the table. If you aren’t working with worldwide data, and can place your map in a defined box, Kartograph has the options you need to stand out from the crowd.

28. CartoDB

CartoDB provides an unparalleled way to combine maps and tabular data to create visualisations

CartoDB is a must-know site. The ease with which you can combine tabular data with maps is second to none. For example, you can feed in a CSV file of address strings and it will convert them to latitudes and longitudes and plot them on a map, but there are many other uses. It’s free for up to five tables; after that, there are monthly pricing plans.

29. Processing

Processing provides a cross-platform environment for creating images, animations, and interactions

Processing has become the poster child for interactive visualizations. It enables you to write much simpler code which is in turn compiled into Java.

There is also a Processing.js project to make it easier for websites to use Processing without Java applets, plus a port to Objective-C so you can use it on iOS. It is a desktop application, but can be run on all platforms, and given that it is now several years old, there are plenty of examples and code from the community.

30. NodeBox

NodeBox is a quick, easy way for Python-savvy developers to create 2D visualisations

NodeBox is an OS X application for creating 2D graphics and visualizations. You need to know and understand Python code, but beyond that it’s a quick and easy way to tweak variables and see results instantly. It’s similar to Processing, but without all the interactivity.

31. R

A powerful free software environment for statistical computing and graphics, R is the most complex of the tools listed here

How many other pieces of software have an entire search engine dedicated to them? A statistical package used to parse large data sets, R is a very complex tool, and one that takes a while to understand, but it has a strong community and package library, with more and more being produced.

The learning curve is one of the steepest of any of the tools listed here, but if you want to work at this level, you must be comfortable using it.

32. Weka

A collection of machine-learning algorithms for data-mining tasks, Weka is a powerful way to explore data

When you get deeper into being a data scientist, you will need to expand your capabilities from just creating visualizations to data mining. Weka is a good tool for classifying and clustering data based on various attributes – both powerful ways to explore data – but it also has the ability to generate simple plots.

33. Gephi

Gephi in action. Coloured regions represent clusters of data that the system is guessing are similar

When people talk about relatedness, social graphs and co-relations, they are really talking about how two nodes are related to one another relative to the other nodes in a network. The nodes in question could be people in a company, words in a document or passes in a football game, but the maths is the same.

Gephi, a graph-based visualiser and data explorer, can not only crunch large data sets and produce beautiful visualizations, but also allows you to clean and sort the data. It’s a very niche use case and a complex piece of software, but it puts you ahead of anyone else in the field who doesn’t know about this gem.

34. iCharts

iCharts can have interactive elements, and you can pull in data from Google Docs

The iCharts service provides a hosted solution for creating and presenting compelling charts for inclusion on your website. There are many different chart types available, and each is fully customisable to suit the subject matter and colour scheme of your site.

Charts can have interactive elements, and can pull data from Google Docs, Excel spreadsheets and other sources. The free account lets you create basic charts, while you can pay to upgrade for additional features and branding-free options.

35. Flot

Create animated visualisations with this jQuery plugin

Flot is a specialised plotting library for jQuery, but it has many handy features and crucially works across all common browsers including Internet Explorer 6. Data can be animated and, because it’s a jQuery plugin, you can fully control all the aspects of animation, presentation and user interaction. This does mean that you need to be familiar with (and comfortable with) jQuery, but if that’s the case, this makes a great option for including interactive charts on your website.

36. Raphaël

This handy JavaScript library offers a range of data visualisation options

This handy JavaScript library offers a wide range of data visualization options which are rendered using SVG. This makes for a flexible approach that can easily be integrated within your own web site/app code, and is limited only by your own imagination.

That said, it’s a bit more hands-on than some of the other tools featured here (a victim of being so flexible), so unless you’re a hardcore coder, you might want to check out some of the more point-and-click orientated options first!

37. jQuery Visualize

jQuery Visualize Plugin is an open source charting plugin

Written by the team behind jQuery’s ThemeRoller and jQuery UI websites, jQuery Visualize Plugin is an open source charting plugin for jQuery that uses HTML Canvas to draw a number of different chart types. One of the key features of this plugin is its focus on achieving ARIA support, making it friendly to screen-readers. It’s free to download from this page on GitHub.

Further reading

  • A great Tumblr blog for visualization examples and inspiration: vizualize.tumblr.com
  • Nicholas Felton’s annual reports are now infamous, but he also has a Tumblr blog of great things he finds.
  • From the guy who helped bring Processing into the world:benfry.com/writing
  • Stamen Design is always creating interesting projects:stamen.com
  • Eyeo Festival brings some of the greatest minds in data visualization together in one place, and you can watch the videos online.

Brian Suda is a master informatician and author of Designing with Data, a practical guide to data visualisation.

Originally posted via “The 37 best tools for data visualization”

 


Big Data Introduction to D3 [video]

Synopsis:

D3.js is a JavaScript library primarily used to create interactive data visualizations in the browser. Despite its growing popularity and warm community, getting started with D3 can be tricky. This talk covers the basics of D3 and sheds light on some of its main conceptual hurdles. It concludes by discussing some applications of D3 to big data.
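One of those conceptual hurdles is the scale: a function mapping a data domain onto a pixel range. As a minimal sketch of the idea (the real API, roughly d3.scale.linear().domain(...).range(...) in the D3 versions of this era, is chainable and does much more):

```javascript
// Minimal sketch of a linear scale, the idea behind d3.scale.linear():
// interpolate a value from the data domain into the pixel range.
function linearScale(domain, range) {
  var d0 = domain[0], d1 = domain[1];
  var r0 = range[0], r1 = range[1];
  return function (x) {
    return r0 + ((x - d0) / (d1 - d0)) * (r1 - r0);
  };
}

var y = linearScale([0, 100], [0, 500]);
// y(50) maps the middle of the domain to the middle of the range: 250
```

Once scales click, most of D3’s enter/update/exit data join is just applying functions like this to bound data.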

 

About the speaker:

 

Sam Selikoff [ http://www.samselikoff.com/ | @samselikoff ] is a self-taught full-stack web developer. Formerly a graduate student of economics and finance, he unexpectedly discovered a passion for programming while doing data work for a consulting firm. He is currently focusing on client-side MVC and data visualization.

Here’s the youtube link:

http://www.youtube.com/watch?v=kFCDA1uzGFo

Here’s the slideshare:

Thanks to our Sponsors
Microsoft, for providing an awesome venue for the event.

Rovi, for providing the food and drinks.

cognizeus, for hosting the event and providing books to give away as raffle prizes.

Source: Big Data Introduction to D3 by v1shal

Jul 13, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Complex data  Source

[ AnalyticsWeek BYTES]

>> The intersection of analytics, social media and cricket in the cognitive era of computing by anum

>> Hacking the Data Science by v1shal

>> From Mad Men to Math Men: Is Mass Advertising Really Dead? by tony

Wanna write? Click Here

[ NEWS BYTES]

>>
 Data analytics at Alexandra Health System: A new journey in healthcare – Singapore Business Review Under  Health Analytics

>>
 In-memory software startup Alluxio tosses hat in big data storage ring – TechTarget Under  Big Data

>>
 Microsoft Azure cloud gets OK to handle sensitive Veterans Affairs data – FedScoop Under  Cloud

More NEWS ? Click Here

[ FEATURED COURSE]

Intro to Machine Learning

image

Machine Learning is a first-class ticket to the most exciting careers in data analysis today. As data sources proliferate along with the computing power to process them, going straight to the data is one of the most stra… more

[ FEATURED READ]

How to Create a Mind: The Secret of Human Thought Revealed

image

Ray Kurzweil is arguably today’s most influential—and often controversial—futurist. In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization—reverse… more

[ TIPS & TRICKS OF THE WEEK]

Save yourself from zombie apocalypse from unscalable models
One living and breathing zombie in today’s analytical models is the pulsating absence of error bars. Not every model is scalable or holds its ground as data grows. The error bars attached to almost every model should be duly calibrated: as business models rake in more data, error bars keep them sensible and in check. If error bars are not accounted for, our models become susceptible to failures that lead us to a Halloween we never want to see.
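As a sketch of the simplest calibration the tip asks for, here is a standard error of the mean, the most basic error bar to attach to a model’s reported metric (names here are illustrative, not from any particular library):

```javascript
// Standard error of the mean: sample standard deviation / sqrt(n).
// The width of the simplest error bar around a reported average.
function standardError(values) {
  var n = values.length;
  var mean = values.reduce(function (a, b) { return a + b; }, 0) / n;
  var variance = values.reduce(function (a, v) {
    return a + (v - mean) * (v - mean);
  }, 0) / (n - 1); // sample variance (n - 1 in the denominator)
  return Math.sqrt(variance / n);
}

var se = standardError([10, 12, 9, 11, 13]);
// report the metric as mean +/- 1.96 * se for a rough 95% interval
```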

[ DATA SCIENCE Q&A]

Q:Provide a simple example of how an experimental design can help answer a question about behavior. How does experimental data contrast with observational data?
A: * You are researching the effect of listening to music on studying efficiency
* You might divide your subjects into two groups: one listens to music while studying, and the other (the control group) doesn’t listen to anything
* You then give both groups the same test
* Finally, you compare grades between the two groups

Differences between observational and experimental data:
– Observational data: measures the characteristics of a population by studying individuals in a sample, but doesn’t attempt to manipulate or influence the variables of interest
– Experimental data: applies a treatment to individuals and attempts to isolate the effects of the treatment on a response variable

Observational data: find 100 women aged 30, of whom 50 have been smoking a pack a day for 10 years while the other 50 have been smoke-free for 10 years. Measure lung capacity for each of the 100 women. Analyze, interpret and draw conclusions from the data.

Experimental data: find 100 women aged 20 who don’t currently smoke. Randomly assign 50 of the 100 women to the smoking treatment and the other 50 to the no-smoking treatment. Those in the smoking group smoke a pack a day for 10 years, while those in the control group remain smoke-free for 10 years. Measure lung capacity for each of the 100 women. Analyze, interpret and draw conclusions from the data.
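The step that makes the second design experimental is the random assignment. A minimal sketch (function and subject names are illustrative):

```javascript
// Randomly assign subjects to two equal groups: shuffle with
// Fisher-Yates, then split the pool in half.
function randomAssign(subjects) {
  var pool = subjects.slice();
  for (var i = pool.length - 1; i > 0; i--) {
    var j = Math.floor(Math.random() * (i + 1));
    var tmp = pool[i]; pool[i] = pool[j]; pool[j] = tmp;
  }
  var half = Math.floor(pool.length / 2);
  return { treatment: pool.slice(0, half), control: pool.slice(half) };
}

var subjects = [];
for (var s = 1; s <= 100; s++) subjects.push('subject-' + s);
var groups = randomAssign(subjects);
// groups.treatment and groups.control each hold 50 subjects
```

Randomization is what lets the experimenter attribute a difference in the response to the treatment rather than to pre-existing differences between the groups.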

Source

[ VIDEO OF THE WEEK]

Using Analytics to build A #BigData #Workforce

 Using Analytics to build A #BigData #Workforce

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

You can use all the quantitative data you can get, but you still have to distrust it and use your own intelligence and judgment. – Alvin Toffler

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @ScottZoldi, @FICO

 #BigData @AnalyticsWeek #FutureOfData #Podcast with @ScottZoldi, @FICO

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

In that same survey, by a small but noticeable margin, executives at small companies (fewer than 1,000 employees) are nearly 10 percent more likely to view data as a strategic differentiator than their counterparts at large enterprises.

Sourced from: Analytics.CLUB #WEB Newsletter

IT jobs to shift to new tech, data analytics, cloud services

The $70-billion Indian IT export services industry, which recorded a 12.6 per cent year-on-year growth in FY 2015 and employs 1.2 million people, is all set to witness shrinkage in jobs over the next three years.

Shift in demand

While demand for traditional IT services will dip, demand for niche skills in next-generation technologies such as Cloud, Social, Mobility, Big Data, Analytics and Internet of Things (IoT) will result in a war for talent.

“Over a three-year horizon, the total number of jobs in IT services will witness a 10 per cent shrinkage as the legacy business continues to evaporate and next-generation technology services are in demand by global customers,” Jaideep Mehta, Managing Director, IDC India and South Asia, told BusinessLine.

“Low-end, low-skilled software developers and infrastructure maintenance engineers will become redundant. While IT services firms will endeavour to re-skill or re-shuffle this low-end talent and re-deploy them, those who are unable to transition into newer technologies will no longer be relevant.

“New skill sets will be in demand, such as, Cloud architects, Security specialists, IT administrators, Analytics and IoT experts, etc, resulting in a war for the same talent,” Mehta predicts.

Concurring with Mehta, Rishi Das, CEO of CareerNet Consulting, said: “As systems become more intelligent with automation, artificial intelligence and big data analysis, IT productivity will improve by 10-20 percentage points and manpower addition will shrink by about 8-10 per cent over the next three years. For instance, customer support will move to a self-service model enabled via mobiles; coding jobs will also shrink as a lot of domain-specific software will be available on the Cloud, deployed and used with a little bit of customisation.”

K Harishankaran, co-founder of HackerRank, which has a database of 1 million Indian and global programmers on its platform, is also of the opinion that when hiring stabilises over the next 3-5 years, it could lead to a 10-15 per cent shrinkage in recruitment.

He points out that currently, demand for quality talent in engineering, product development, DevOps, Big Data, Analytics and Mobile apps development especially for iOS and Android apps, is on the rise but difficult to find.

Job creation

According to Nasscom data, the Indian IT & BPO industry added 200,000 jobs in FY 2015, of which 80 per cent were in traditional services.

Sangeeta Gupta, Senior Vice-President of Nasscom, admitted that the rate of job creation in traditional IT services will shrink as the industry gradually moves toward automation and digital technologies.

This will be offset by brisk hiring by product companies, start-ups, consumer internet firms and even large brick-and-mortar retail chains, which are all looking to invest in providing online and mobile access, she said.

To read the original article on The Hindu Business Line, click here.

Originally Posted at: IT jobs to shift to new tech, data analytics, cloud services

4 Reasons Data Driven Small Businesses Are Embracing Big Data

Small businesses are thinking bigger about their data – and it’s about time.

The term big data sounds intimidating – reserved only for the Fortune 500 leaders – but that could not be further from the reality of data analytics in the competitive small business market today.

Previously the exclusive domain of statisticians, large corporations and information technology departments, the emerging availability of data and analytics – call it a new democratization – gives small businesses and consumers greater access to cost-effective, sophisticated, data-powered tools and analytical systems.

For small businesses, big data will deliver meaningful insights on markets, competition and bottom-line business results.

For small businesses and consumers, the big data revolution promises a wide range of benefits.

New Tech, New Rules

Today, big data is changing the rules of commerce and business operations, creating opportunities and challenges for small businesses. The convergence of three leading computing trends – cloud technologies, mobile technologies and social media – is creating cost-effective, data-rich platforms on which to build new businesses and drive economic growth for small and large businesses alike. This helps boost local economies as well as global e-commerce and trade.

Optimizing Insights

Digital data will continue to turbocharge the movement to understand analytics, in both small and large businesses. Proprietary data combined with data from the cloud will continue to create new insights and a deeper understanding of what consumers need, what they like and what will keep them happy.

The development of new data sources and unique analytics will drive entrepreneurial growth around the globe over the coming decade.

Better Management

Today, small businesses can leverage business management solutions, including Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) software platforms, to automate operational management tasks and keep better watch over their very own big data – including analytical views of sales and marketing campaigns.

Small businesses can stay on top of accounting, cash flow, budgets, balances and more with financial management software alternatives, as well as tools and applications for inventory management, project management, fleet management, human resources and more.

Real-Time Decisions

By optimizing real-time data analytics, small businesses today are capturing a better view of their administrative, sales and marketing practices – including real-time overviews of what’s working well and what needs scrutiny. Small businesses mining their own big data today routinely deploy a variety of solutions – most originating in the cloud – to improve operational and administrative efficiency and productivity, while reducing manual tasks and redundancies.

Today, small businesses are no longer intimidated by big data. They are embracing it to create and manage bigger opportunities for growth and profitability.

Today’s competitive small businesses realize that optimizing analytics and business intelligence allows them to recognize the full benefits of their very own big data – powering better marketing, sales and operational efficiency, productivity and functional gains. With data-driven tasks and decisions in the mix, a new culture of small business is emerging, powering greater opportunities for the small business community, its vendors and customers.

Angela Nadeau  is CEO of CompuData, an award-winning business technologies leader. Angela maintains a deep knowledge of the trends driving businesses today to be more productive and profitable by leveraging technology. With more than 25 years of expertise, she has advised thousands of businesses on effective ways to leverage technology to increase productivity, profitability and efficiency – guiding businesses of all sizes to new levels of market success and corporate growth.

Originally posted via “4 Reasons Data Driven Small Businesses Are Embracing Big Data


Jul 06, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Statistically Significant  Source

[ AnalyticsWeek BYTES]

>> Apple partners with IBM on new health data analysis by anum

>> The Net Promoter Score: Let Us Not Forget The Past by bobehayes

>> The Practice of Customer Experience Management: Paper for a Tweet by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>>
 Leak: Facebook targeting vulnerable youth with predatory ads – Northern Star Under  Sentiment Analysis

>>
 Best Hospital IT 2016: A pop health, precision medicine, value-based care reality check – Healthcare IT News Under  Health Analytics

>>
 SmartBase acquires assets of Group 3 Marketing 24 October 2016 – Research Magazine Under  Marketing Analytics

More NEWS ? Click Here

[ FEATURED COURSE]

The Analytics Edge

image

This is an Archived Course
EdX keeps courses open for enrollment after they end to allow learners to explore content and continue learning. All features and materials may not be available, and course content will not be… more

[ FEATURED READ]

On Intelligence

image

Jeff Hawkins, the man who created the PalmPilot, Treo smart phone, and other handheld devices, has reshaped our relationship to computers. Now he stands ready to revolutionize both neuroscience and computing in one strok… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly sneaking its way into our core corporate DNA, yet we are not churning out practice grounds to test those models fast enough. Snug-looking models can hide nails that induce uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace, to lab out best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q:Why is naive Bayes so bad? How would you improve a spam detection algorithm that uses naive Bayes?
A: Naïve: the features are assumed independent/uncorrelated
This assumption rarely holds in practice (spam tokens such as “free” and “offer” tend to co-occur)
Improvement: decorrelate the features, e.g. by whitening them so that the covariance matrix becomes the identity matrix
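To make the independence assumption concrete, here is a minimal multinomial naive Bayes spam scorer: the score multiplies per-word probabilities as if words occurred independently, which is exactly what goes wrong on correlated features. Laplace smoothing and the tiny corpus are illustrative.

```javascript
// Train: count word occurrences per class and documents per class.
function trainNB(docs) {
  var model = { spam: {}, ham: {}, spamDocs: 0, hamDocs: 0, vocab: {} };
  docs.forEach(function (d) {
    var counts = model[d.label];
    model[d.label + 'Docs']++;
    d.words.forEach(function (w) {
      counts[w] = (counts[w] || 0) + 1;
      model.vocab[w] = true;
    });
  });
  return model;
}

// Log P(class) + sum of log P(word | class), with Laplace smoothing.
// Summing per-word terms is the "naive" independence assumption.
function logProb(model, label, words) {
  var counts = model[label];
  var total = 0, V = Object.keys(model.vocab).length;
  Object.keys(counts).forEach(function (w) { total += counts[w]; });
  var lp = Math.log(model[label + 'Docs'] / (model.spamDocs + model.hamDocs));
  words.forEach(function (w) {
    lp += Math.log(((counts[w] || 0) + 1) / (total + V));
  });
  return lp;
}

function classify(model, words) {
  return logProb(model, 'spam', words) > logProb(model, 'ham', words)
    ? 'spam' : 'ham';
}

var model = trainNB([
  { label: 'spam', words: ['free', 'offer', 'win'] },
  { label: 'spam', words: ['free', 'win', 'cash'] },
  { label: 'ham', words: ['meeting', 'tomorrow', 'report'] },
  { label: 'ham', words: ['project', 'report', 'draft'] }
]);

var verdict1 = classify(model, ['free', 'cash']);    // 'spam'
var verdict2 = classify(model, ['report', 'draft']); // 'ham'
```

Decorrelating (whitening) the feature space before training is one way to make the independence assumption less wrong; in practice, better tokenization and feature selection help a naive Bayes spam filter too.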

Source

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData with Jon Gibs(@jonathangibs) @L2_Digital

 #BigData @AnalyticsWeek #FutureOfData with Jon Gibs(@jonathangibs) @L2_Digital

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom. – Clifford Stoll

[ PODCAST OF THE WEEK]

#FutureOfData Podcast: Peter Morgan, CEO, Deep Learning Partnership

 #FutureOfData Podcast: Peter Morgan, CEO, Deep Learning Partnership

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

140,000 to 190,000: the projected shortage of people with deep analytical skills needed to meet the demand for Big Data jobs in the U.S. by 2018.

Sourced from: Analytics.CLUB #WEB Newsletter