Sep 19, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

SQL Database  Source

[ AnalyticsWeek BYTES]

>> 22 tips for better data science by analyticsweekpick

>> 6 Questions to Ask When Preparing Data for Analysis by analyticsweek

>> The UX of Brokerage Websites by analyticsweek

Wanna write? Click Here

[ FEATURED COURSE]

Tackle Real Data Challenges


Learn scalable data management, evaluate big data technologies, and design effective visualizations…. more

[ FEATURED READ]

Machine Learning With Random Forests And Decision Trees: A Visual Guide For Beginners


If you are looking for a book to help you understand how the machine learning algorithms “Random Forest” and “Decision Trees” work behind the scenes, then this is a good book for you. Those two algorithms are commonly u… more

[ TIPS & TRICKS OF THE WEEK]

Strong business case could save your project
Like anything in corporate culture, the project is oftentimes about the business, not the technology. With data analysis, the same type of thinking goes. It’s not always about the technicality but about the business implications. Data science project success criteria should include project management success criteria as well. This will ensure smooth adoption, easy buy-ins, room for wins and co-operating stakeholders. So, a good data scientist should also possess some qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q:What is: collaborative filtering, n-grams, cosine distance?
A: Collaborative filtering:
– Technique used by some recommender systems
– Filtering for information or patterns using techniques involving collaboration of multiple agents: viewpoints, data sources.
1. A user expresses his/her preferences by rating items (e.g., movies, CDs)
2. The system matches this user's ratings against other users' and finds the people with the most similar tastes
3. From those similar users, the system recommends items that they have rated highly but that this user has not yet rated
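As a rough illustration (not part of the original answer), here is a minimal Python sketch of user-based collaborative filtering; the rating matrix, item indices and scores are purely hypothetical:

import numpy as np

# Hypothetical user-item rating matrix (rows = users, columns = items); 0 = not yet rated
ratings = np.array([
    [5, 4, 0, 1],   # target user
    [5, 5, 4, 0],
    [1, 0, 5, 4],
])

def cosine(u, v):
    # Cosine similarity between two rating vectors
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

target = ratings[0]
similarities = [cosine(target, other) for other in ratings[1:]]
most_similar = ratings[1:][int(np.argmax(similarities))]

# Recommend items the most similar user rated highly but the target user has not rated
recommendations = [item for item, (mine, theirs) in enumerate(zip(target, most_similar))
                   if mine == 0 and theirs >= 4]
print(recommendations)  # [2]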

n-grams:
– Contiguous sequence of n items from a given sequence of text or speech
– 'Andrew is a talented data scientist'
– Bi-gram: 'Andrew is', 'is a', 'a talented'.
– Tri-grams: 'Andrew is a', 'is a talented', 'a talented data'.
– An n-gram model models sequences using statistical properties of n-grams; see: Shannon Game
– More concisely, n-gram model: P(x_i | x_{i-(n-1)}, …, x_{i-1}): a Markov model
– N-gram model: each word depends only on the n-1 last words
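For illustration only, a short Python sketch that extracts the bigrams and trigrams of the example sentence above:

def ngrams(text, n):
    # Return the contiguous n-grams (joined as strings) of a whitespace-tokenized text
    tokens = text.split()
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

sentence = "Andrew is a talented data scientist"
print(ngrams(sentence, 2))  # ['Andrew is', 'is a', 'a talented', 'talented data', 'data scientist']
print(ngrams(sentence, 3))  # ['Andrew is a', 'is a talented', 'a talented data', 'talented data scientist']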

Issues:
– when facing infrequent n-grams
– solution: smooth the probability distributions by assigning non-zero probabilities to unseen words or n-grams
– Methods: Good-Turing, Backoff, Kneser-Ney smoothing
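To make the smoothing idea concrete, here is a Python sketch of the simpler add-one (Laplace) scheme, which is not one of the methods listed above but applies the same principle of giving unseen n-grams non-zero probability:

from collections import Counter

def bigram_prob_add_one(tokens, w1, w2):
    # P(w2 | w1) with add-one (Laplace) smoothing: unseen bigrams still get non-zero probability
    bigrams = Counter(zip(tokens, tokens[1:]))
    unigrams = Counter(tokens)
    vocab_size = len(set(tokens))
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

tokens = "Andrew is a talented data scientist".split()
print(bigram_prob_add_one(tokens, "is", "a"))           # seen bigram: 2/7
print(bigram_prob_add_one(tokens, "data", "talented"))  # unseen bigram: 1/7, still non-zero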

Cosine distance:
– How similar are two documents?
– Perfect similarity/agreement: 1
– No agreement : 0 (orthogonality)
– Measures the orientation, not magnitude

Given two vectors A and B representing word frequencies:
cosine-similarity(A, B) = ⟨A, B⟩ / (||A|| · ||B||)
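A quick worked example of the formula in Python, with made-up word counts:

import numpy as np

# Hypothetical word-frequency vectors for two short documents
A = np.array([2, 1, 0, 3])
B = np.array([1, 1, 1, 2])

cos_sim = A @ B / (np.linalg.norm(A) * np.linalg.norm(B))
print(round(float(cos_sim), 3))  # 0.909: similar orientation, regardless of vector magnitude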

Source

[ VIDEO OF THE WEEK]

Jeff Palmucci @TripAdvisor discusses managing a #MachineLearning #AI Team

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

With data collection, ‘the sooner the better’ is always the best answer. – Marissa Mayer

[ PODCAST OF THE WEEK]

Solving #FutureOfWork with #Detonate mindset (by @steven_goldbach & @geofftuff) #JobsOfFuture #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Distributed computing (performing computing tasks using a network of computers in the cloud) is very real. Google uses it every day to involve about 1,000 computers in answering a single search query, which takes no more than 0.2 seconds to complete.

Sourced from: Analytics.CLUB #WEB Newsletter

Big Data Provides Big Insights for U.S. Hospitals

The U.S. government provides a variety of publicly available databases that include metrics on the performance of U.S. hospitals, including a patient experience (PX) database, a health outcomes database, a process-of-care database and a medical spending database. Applying Big Data principles to these disparate data sources, I integrated different metrics from their respective databases to better understand the quality of U.S. hospitals and determine ways they can improve the patient experience and the overall healthcare delivery system. I spent the summer analyzing this data and wrote many posts about it.

Why the Patient Experience (PX) has Become an Important Topic for U.S. Hospitals

The Centers for Medicare & Medicaid Services (CMS) will be using patient feedback about their care as part of its reimbursement plan for acute care hospitals (see the Hospital Value-Based Purchasing (VBP) program). The purpose of the VBP program is to promote better clinical outcomes for patients and improve their experience of care during hospital stays. Not surprisingly, hospitals are focusing on improving the patient experience to ensure they receive the maximum incentive payments.

Key Findings from Analyses of Big Data of US Hospitals

Hospitals, like all big businesses, struggle with knowing "if you do this, then you will succeed." While hospital administrators can rely on gut feelings, intuition and anecdotal evidence to guide their decisions on how to improve their hospitals, data-driven decision-making provides better, more reliable insights about real things hospital administrators can do to improve their hospitals. While interpretations of my analyses of these Big Data are debatable, the data are what they are.

I have highlighted some key findings below (with accompanying blog posts) that provide value for different constituencies: 1) healthcare consumers can find the best hospitals, 2) healthcare providers can focus on areas that improve how they deliver healthcare, and 3) healthcare researchers can uncover deeper insights about factors that impact the patient experience and health outcomes.

  1. Healthcare Consumers Can Use Interactive Maps of US Hospital Ratings to Select the Best Provider. Healthcare consumers can use interactive maps to understand the quality of their hospitals with respect to three metrics: 1) Map of US hospitals on patient satisfaction, 2) Map of US hospitals on health outcomes, and 3) Map of US hospitals on process of care. Take a look at each to know how your hospital performs.
  2. Hospitals Can Use Patient Surveys to Improve Patient Loyalty. Hospitals might be focusing on the wrong areas to improve patient loyalty. While researchers found that hospitals' top 3 priorities for improving the patient experience are 1) reducing noise, 2) improving patient rounding and 3) improving the discharge process and instructions, analysis of HCAHPS survey results shows that hospitals will likely receive a greater return on their improvement investment (ROI) if they focus on improving the patient experience along these dimensions: 1) pain management, 2) staff responsiveness and 3) staff explaining meds.
  3. There are Differences in the Patient Experience across Acute Care and Critical Access Hospitals. Acute care hospitals receive lower patient satisfaction ratings compared to critical access hospitals. Differences across these two types of hospitals also extend to ways to improve the patient experience. The key areas for improving patient loyalty/advocacy differ across hospital types. ACHs need to focus on 1) Staff explains meds, 2) Responsiveness and 3) Pain management. CAHs need to focus on 1) Pain management and 2) Responsiveness.
  4. Patient Satisfaction is Related to Health Outcomes and Process of Care Measures. The patient experience dimension that had the highest correlation with readmission rates and process of care measures was "Given information about my recovery upon discharge." Hospitals that received good patient ratings on this dimension also experienced lower readmission rates and higher process of care scores compared to hospitals with poor patient ratings in this area.
  5. Medical Spending is Not Related to Patient Satisfaction. I found that hospitals with lower medical spend per patient are able to deliver a comparable patient experience to hospitals with greater medical spend per patient.

Insights gained from combining and integrating disparate databases (especially ones that include both attitudinal and operational/objective metrics) provide much greater value than any single database can provide by itself. That is one of the major values of using Big Data principles. The integrated healthcare Big Data set was rich in insights and allowed us to answer bigger questions about how best to improve the patient experience and health outcomes.

Source: Big Data Provides Big Insights for U.S. Hospitals by bobehayes

Logi Tutorial: How to Integrate Logi with Microsoft Active Directory for Enhanced User Authentication

This post originally appeared on dbSeer, a business analytics consulting firm and Logi Analytics partner.

As an increasing number of companies move their infrastructure to Microsoft's Azure, it seems natural to rely on its Active Directory for user authentication. Logi application users can also reap the benefits of this enterprise-level security infrastructure without having to duplicate anything. Additionally, even smaller companies that use Office 365 without any other cloud infrastructure (beyond email, of course) can take advantage of this authentication.

Integrating Logi applications with Microsoft’s Active Directory produces two main benefits: attaining world class security for your Logi applications, and simplifying matters by having a single source of authentication. The following post describes how this integration is done.

1. Register Your Application with Microsoft

First, register your application with Azure Active Directory v2.0. This will allow us to request an access token from Microsoft for the user. To do this, navigate to "https://apps.dev.microsoft.com/#/appList" and click the "Add an app" button. After entering your application name, on the following page, click the "Add Platform" button and select "Web". Under Redirect URLs, enter the URL of your website logon page (sample format: https:////.jsp). Microsoft does not support redirects to http sites, so your page must either use https or localhost. Make note of the redirect URL and application ID for the next step.

2. Create Custom Log-on Page for Logi Application

Microsoft allows users to grant permissions to an application through its OAuth2 sign-in page. This process returns an access token, which has a name, email address, and several other pieces of information embedded within it; we use these to identify the user.
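The access token is a JWT whose middle segment is base64url-encoded JSON. As a rough, language-agnostic illustration of what the custom log-on page does (the actual sample page does this in JavaScript), here is a Python sketch of decoding that payload; the claim names shown are assumptions and vary by tenant:

import base64
import json

def jwt_payload(token):
    # Decode the (unverified) payload segment of a JWT access token
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # base64url needs padding to a multiple of 4
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# claims = jwt_payload(access_token)
# email = claims.get("preferred_username") or claims.get("email")  # assumed claim names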

These next steps show you how to create a log-in page that redirects users to the Microsoft sign-in, retrieves the access token, and passes whichever value you want to use to identify the employee to Logi.

1) Download the rdLogonCustom.jsp file, or copy and paste its contents into a file. Place it in the base folder of your application.
2) Configure the following settings within the rdLogonCustom.jsp file to match your Logi application:

Change the 'action' element in the HTML body to the address of your main Logi application page.


Change the "redirectUri" and "appId" values in the buildAuthUrl() function to match the information from your application registration with Azure AD v2.0.

The sample log-on page redirects the user to Microsoft's page and lets the user sign in before redirecting back to the log-on page. There, the page parses the token for the email address and passes the value to the authentication element as a request parameter via the hidden input.

If you want to use a different value from the access token to identify the user, adjust the key in the "document.getElementById('email').value = payload." line at the bottom of the custom logon file to match your desired value.

3. Configure Logi App

In your _Settings.lgx file, add a security element with the following settings:

*If your log-on page and failure page have different names, adjust accordingly.
Under the security element, add an authentication element with a data layer that uses the value found in @Request.email~ to identify the user. Optionally, you can add rights and roles elements to the security element as well.

In conclusion, utilizing this integration for your Logi applications can not only make your process more efficient by eliminating duplicate authentication, but it can also provide an added level of security thanks to Microsoft's robust infrastructure.

Source: Logi Tutorial: How to Integrate Logi with Microsoft Active Directory for Enhanced User Authentication

Sep 12, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Ethics  Source

[ AnalyticsWeek BYTES]

>> All the Ways to Connect Your Azure SQL Data Warehouse with Talend by analyticsweekpick

>> How oil and gas firms are failing to grasp the necessity of Big Data analytics by analyticsweekpick

>> 5 Ways Manufacturing Analytics Will Change Your Business by analyticsweek

Wanna write? Click Here

[ FEATURED COURSE]

Statistical Thinking and Data Analysis


This course is an introduction to statistical data analysis. Topics are chosen from applied probability, sampling, estimation, hypothesis testing, linear regression, analysis of variance, categorical data analysis, and n… more

[ FEATURED READ]

Antifragile: Things That Gain from Disorder


Antifragile is a standalone book in Nassim Nicholas Taleb’s landmark Incerto series, an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision-making in a world we don’t understand. The… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today a data-driven leader, data scientist or data-driven expert is constantly put to the test by helping his or her team solve problems using skills and expertise. Believe it or not, part of that decision tree is derived from intuition, which adds a bias to our judgement and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we fall into tiny traps and can find ourselves caught in biases that impair judgement. So, it is important that we keep the intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q:Provide examples of machine-to-machine communications?
A: Telemedicine
– Heart patients wear specialized monitors which gather information regarding the state of the heart
– The collected data is sent to an implanted electronic device which sends electric shocks back to the patient to correct irregular rhythms

Product restocking
– Vending machines are capable of messaging the distributor whenever an item is running out of stock

Source

[ VIDEO OF THE WEEK]

Pascal Marmier (@pmarmier) @SwissRe discusses running data driven innovation catalyst

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

He uses statistics as a drunken man uses lamp posts—for support rather than for illumination. – Andrew Lang

[ PODCAST OF THE WEEK]

George (@RedPointCTO / @RedPointGlobal) on becoming an unbiased #Technologist in #DataDriven World #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Big data is a top business priority and drives enormous opportunity for business improvement. Wikibon’s own study projects that big data will be a $50 billion business by 2017.

Sourced from: Analytics.CLUB #WEB Newsletter

3 Big Data Stocks Worth Considering

Big data is a trend that I’ve followed for some time now, and even though it’s still in its early stages, I expect it to continue to be a game changer as we move further into the future.

As our Internet footprint has grown, all the data we create — from credit cards to passwords and pictures uploaded on Instagram — has to be managed somehow.

This data is too vast to be entered into traditional relational databases, so more powerful tools are needed for companies to utilize the information to analyze customers’ behavior and predict what they may do in the future.

Big data makes it all possible, and as a result is one of the dominant themes for technology growth investing. We’ve invested in several of these types of companies in my GameChangers service over the years, one of which we’ll talk more about in just a moment.

First, let’s start with two of the biggest and best big data names out there. They’re among the best pure plays, and while I’m not sure the time is quite right to invest in either right now, they are both garnering some buzz in the tech world.

Big Data Stocks: Splunk (SPLK)

The first is Splunk (SPLK). Splunk's flagship product is Splunk Enterprise, which at its core is a proprietary machine data engine that enables dynamic schema creation on the fly. Users can then run queries on data without having to understand the structure of the information prior to collection and indexing.

Faster, streamlined processes mean more efficient (and more profitable) businesses.

While Splunk is very small in terms of revenues, with January 2015 fiscal year sales of just $451 million, it is growing rapidly, and I’m keeping an eye on the name as it may present a strong opportunity down the road.

However, I do not want to overpay for it. Splunk brings effective technology to the table that is gaining market acceptance, and has strong security software partners with its recent entry into security analytics. At the right price, the stock could also be a takeover candidate for a larger IT company looking to enhance its Big Data presence.

Big Data Stocks: Tableau Software (DATA)

Another name on my radar is Tableau Software (DATA), which performs functions similar to Splunk's. Its primary product, VizQL, translates drag-and-drop actions into data queries. In this way, the company puts data directly in the hands of decision makers, without first having to go through technical specialists.

In fact, the company believes all employees, no matter what their rank in the company, can use its product, leading to the democratization of data.

DATA is also growing rapidly, even faster than Splunk. Revenues were up 78% in 2014 and 75% in the first quarter of 2015, including license revenue growth of more than 70%. That rate is expected to slow somewhat, with revenue growth for all of 2015 estimated at a still-strong 50%.

Tableau stock is also very expensive, trading at 12X expected 2015 revenues of $618 million and close to 300X projected EPS of 40 cents for the year. DATA is a little risky to buy at current levels, but it is a name to keep an eye on in any pullback.

Big Data Stocks: Red Hat (RHT)

The company we made money on earlier this year in my GameChangers service is Red Hat (RHT). We booked a 15% profit in just a few months after it popped 11% on fourth-quarter earnings.

Red Hat is the world's leading provider of open-source solutions, providing software to 90% of Fortune 500 companies. Some of RHT's customers include well-known names like Sprint (S), Adobe Systems (ADBE) and Cigna Corporation (CI).

Management’s goal is to become the undisputed leader of enterprise cloud computing, and it sees its popular Linux operating system as a way to the top. If RHT is successful — as I expect it will be — Red Hat should have a lengthy period of expanded growth as corporations increasingly move into the cloud.

Red Hat's operating results had always clearly demonstrated that its solutions are gaining greater acceptance in IT departments, as revenues more than doubled in the five years between 2009 and 2014, from $748 million to $1.53 billion. I had expected to see the strong sales growth continue throughout 2015, and it did. As I mentioned, impressive fiscal fourth-quarter results sent the shares 11% higher.

I recommended my subscribers sell their stake in the company at the end of March because I believed any further near-term upside was limited. Since then, shares have traded mostly between $75 and $80. It is now at the very top of that range and may be on the verge of breaking above it after the company reported fiscal first-quarter results last night.

Although orders were a little slow, RHT beat estimates on both the top and bottom lines in the first quarter. Earnings of 44 cents per share were up 29% quarter-over-quarter, besting estimates on the Street for earnings of 41 cents. Revenue climbed 14% to $481 million, while analysts had been expecting $472.6 million.

At this point, RHT is now back in uncharted territory, climbing to a new 52-week high earlier today. This is a company with plenty of growth opportunities ahead, and while growth may slow a bit in the near term following the stock's impressive climb so far this year, RHT stands to gain as corporations continue to adopt additional cloud technologies.

To read the original article on InvestorPlace, click here.

Source

Sep 05, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Data analyst  Source

[ AnalyticsWeek BYTES]

>> Out of the Loop on the Internet of Things? Here’s a Brief Guide. by analyticsweekpick

>> Three final talent tips: how to hire data scientists by analyticsweekpick

>> Measuring The Customer Experience Requires Fewer Questions Than You Think by bobehayes

Wanna write? Click Here

[ FEATURED COURSE]

Machine Learning


6.867 is an introductory course on machine learning which gives an overview of many concepts, techniques, and algorithms in machine learning, beginning with topics such as classification and linear regression and ending … more

[ FEATURED READ]

Introduction to Graph Theory (Dover Books on Mathematics)


A stimulating excursion into pure mathematics aimed at “the mathematically traumatized,” but great fun for mathematical hobbyists and serious mathematicians as well. Requiring only high school algebra as mathematical bac… more

[ TIPS & TRICKS OF THE WEEK]

Strong business case could save your project
Like anything in corporate culture, the project is oftentimes about the business, not the technology. With data analysis, the same type of thinking goes. It’s not always about the technicality but about the business implications. Data science project success criteria should include project management success criteria as well. This will ensure smooth adoption, easy buy-ins, room for wins and co-operating stakeholders. So, a good data scientist should also possess some qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q:What are confounding variables?
A: * Extraneous variable in a statistical model that correlates directly or inversely with both the dependent and the independent variable
* A spurious relationship is a perceived relationship between an independent variable and a dependent variable that has been estimated incorrectly
* The estimate fails to account for the confounding factor
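A small simulation (with made-up coefficients) illustrates the point: X and Y below are both driven by a confounder Z, so they appear correlated even though neither causes the other, and the correlation largely vanishes once Z is accounted for.

import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=10_000)           # confounder
x = 2 * z + rng.normal(size=10_000)   # "independent" variable, driven by z
y = 3 * z + rng.normal(size=10_000)   # "dependent" variable, also driven by z

print(np.corrcoef(x, y)[0, 1])        # strong spurious correlation, roughly 0.85

# Crudely "control" for the known confounder by removing its contribution
print(np.corrcoef(x - 2 * z, y - 3 * z)[0, 1])  # near zero once z is accounted for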

Source

[ VIDEO OF THE WEEK]

#GlobalBusiness at the speed of The #BigAnalytics

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

Data is the new science. Big Data holds the answers. – Pat Gelsinger

[ PODCAST OF THE WEEK]

#FutureOfData with @theClaymethod, @TiVo discussing running analytics in media industry

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

A quarter of decision-makers surveyed predict that data volumes in their companies will rise by more than 60 per cent by the end of 2014, with the average of all respondents anticipating a growth of no less than 42 per cent.

Sourced from: Analytics.CLUB #WEB Newsletter

The Best Big Data Applications for Financial Services

Today’s financial service providers operate almost entirely online, and every single transaction and penny transmitted creates hundreds of data points. These massive amounts of data can be incredibly valuable when properly processed, scrubbed, and analyzed. Most importantly, they can help financial services providers make smarter, faster decisions that are backed more by data than intuition.

Even so, it's important to know which data is valuable to track and the best ways to measure it. The foundation of most smart financial dashboards is a clear understanding of the metrics and key performance indicators necessary for discerning important insights. More importantly, the process of finding the right uses for big data begins with understanding where your data comes from and how you can measure it best.

Find The Right Data Streams and KPIs

Creating a powerful financial analytics tool begins with knowing your data sources and identifying the right ways to track them. Data comes from a variety of channels and can prove useful for different types of analysis and metrics.

  • Expenses and Revenues – Regardless of the specific services you provide, every company has overheads and revenues that provide important historic and trend data. This data can include payrolls, income from transactions, operational costs, and fixed costs such as leases, utility costs, and debt payments.
  • Transactions – Volumes of transactions alone are not a reliable measure of a business’ success. A series of investments that cost more than they earn isn’t made better with bigger numbers. Transaction data can display the cost of every deal or investment as well as its margins.
  • Periodic Financial Data – For financial services companies like investment funds, banks, and lenders, understanding growth over time (with measures like compound annual growth rate, or CAGR; see the short sketch after this list) is crucial for updating investment and lending strategies, as well as building new ones.
  • Shareholder Data – This stream includes information about shareholders’ benefits, dividends, and earnings. Information about dividends paid as a percentage of profits, earnings per share, and even the value of shares provides useful data for making major financial decisions.
  • Debt Data – For most financial services providers, client debt can represent both an asset and a liability. Lenders must keep track of outstanding debts, paid ones, and interest rates. It is also an important consideration when making investments, expanding, setting future strategies and lending levels, and understanding the impact of transactions and loan decisions.
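Since CAGR comes up repeatedly in this context, here is the minimal calculation as a Python sketch; the dollar figures are purely hypothetical:

def cagr(begin_value, end_value, years):
    # Compound annual growth rate over `years` years
    return (end_value / begin_value) ** (1 / years) - 1

# Hypothetical: assets under management grow from $120M to $185M over 4 years
print(f"{cagr(120e6, 185e6, 4):.2%}")  # about 11.4% per year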

Examples of Data Visualizations in Financial Services

The most valuable visualizations for financial services providers deliver a clear understanding of historic data and offer potential trends and insights. Some of the most relevant big data use cases in financial services focus around the performance of long-term assets like investments, loans, and other financial products. As such, the right big data analytics tools feature a combination of short and long-term data visualizations that provide a more comprehensive view of a financial services company’s performance:

  • Profit & Loss – Profit and loss (P&L) visualizations help build your understanding of profits and costs on an historic basis. This allows you to track real-time changes and measure the impact of strategy changes and new policies. Moreover, P&L visualizations help you identify where profit margins can be widened and which areas of the business are more profitable and valuable relative to others.

  • Loan Volumes and Outstanding Loans – Visualizing the number of outstanding loans helps providers form a better understanding of policy efficacy and which customers are more likely to pay on time. Interpreting the number of loans can indicate consumer preferences and possible areas for heightened profitability. Comparing it to outstanding and delinquent loans can help highlight the effectiveness of underwriting processes and loan extensions.
  • Investment Portfolio Performance – Most financial services providers keep their funds in a combination of cash on hand and diversified investments. Tracking investment portfolios’ performances can help you allocate resources to optimize profits while concurrently advancing better liquidity and financial stability over the long-term. It can also highlight under-performing assets and lead to better reinvestment decisions.

  • Relative Income Per Customer Type – Banks, lenders, and other financial services providers cater to a variety of customers and needs. Understanding each revenue stream’s profitability helps companies make better decisions about which customers to pursue more aggressively and where to allocate more resources. Additionally, it can highlight trends among customer types for services such as loans and mortgages.

Smart Financial Dashboard Design

The best financial services dashboards include both high-level and granular visualizations that provide a fuller picture. By understanding how specific areas are performing (such as outstanding loans, income per customer type, and more) in conjunction with broader data such as P&L and CAGR, you can make smarter strategic decisions and allocate your company's resources in a manner that optimizes operations and profitability.

Originally Posted at: The Best Big Data Applications for Financial Services by analyticsweek

Aug 29, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Accuracy check  Source

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> How can you reap the advantages of Big Data in your enterprise? Services you can expect from a Remote DBA Expert by thomassujain

>> Why Organizations Are Choosing Talend vs Informatica by analyticsweekpick

>> Predicting UX Metrics with the PURE Method by analyticsweek

Wanna write? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning


This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

The Industries of the Future


The New York Times bestseller, from leading innovation expert Alec Ross, a “fascinating vision” (Forbes) of what’s next for the world and how to navigate the changes the future will bring…. more

[ TIPS & TRICKS OF THE WEEK]

Data Have Meaning
We live in a Big Data world in which everything is quantified. While the emphasis of Big Data has been focused on distinguishing the three characteristics of data (the infamous three Vs), we need to be cognizant of the fact that data have meaning. That is, the numbers in your data represent something of interest, an outcome that is important to your business. The meaning of those numbers is about the veracity of your data.

[ DATA SCIENCE Q&A]

Q:How do you control for biases?
A: * Choose a representative sample, preferably by a random method
* Choose an adequate size of sample
* Identify all confounding factors if possible
* Identify sources of bias and include them as additional predictors in statistical analyses
* Use randomization: by randomly recruiting or assigning subjects in a study, all our experimental groups have an equal chance of being influenced by the same bias

Notes:
– Randomization: in randomized controlled trials, research participants are assigned by chance, rather than by choice, to either the experimental group or the control group.
– Random sampling: obtaining data that is representative of the population of interest
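As a simple illustration of the randomization point, here is a minimal Python sketch that assigns hypothetical subjects to treatment and control groups purely by chance:

import random

random.seed(42)
subjects = [f"subject_{i:02d}" for i in range(1, 21)]  # hypothetical participant IDs
random.shuffle(subjects)

half = len(subjects) // 2
treatment, control = subjects[:half], subjects[half:]
# Each subject had an equal chance of landing in either group, so known and unknown
# biases are spread evenly across the groups in expectation
print(len(treatment), len(control))  # 10 10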

Source

[ VIDEO OF THE WEEK]

@AnalyticsWeek: Big Data Health Informatics for the 21st Century: Gil Alterovitz

Subscribe to  Youtube

[ QUOTE OF THE WEEK]

Data matures like wine, applications like fish. – James Governor

[ PODCAST OF THE WEEK]

#FutureOfData with @CharlieDataMine, @Oracle discussing running analytics in an enterprise

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

This year, over 1.4 billion smart phones will be shipped – all packed with sensors capable of collecting all kinds of data, not to mention the data the users create themselves.

Sourced from: Analytics.CLUB #WEB Newsletter