Data discovery tools, also called data visualization tools and sometimes data analytics tools, are the talk of the town and reasonably hot today. With all the hype about big data, companies are going all-in on planning for their big data and are on the lookout for great tools to recruit into their big-data strategy. One thing to note here is that we don't change our data discovery tool vendors every day. Once a tool gets into our system, it eventually becomes part of our big-data DNA. So, we should put much thought into picking a data discovery/visualization/analysis tool.
So, what would you consider important while picking your data discovery tool vendor? I interviewed a number of data scientists and data managers at a bunch of companies and prioritized the findings into the top seven things to consider before you go out picking your big-data discovery tool vendor.
Here are my seven thoughts, in no particular order:
1. Not a major jump from what I already have: Yes, learning a new system takes time, effort, resources and cycles. So, the faster the ramp-up, or the shorter the learning curve, the better. Sure, many tools will be eons apart from what you are used to, but that should not deter you from evaluating them as well. Just score the tools with the minimum learning curve a bit higher. One thing to check here is that you should be able to do routine things with the new tool in almost the same way as you are used to doing them without it.
2. Helps me do more with my data: There will be moments when you realize a tool can do a lot more than what you are used to, or what you are capable of. This is another check to include in your equation: the feature set and capabilities within the discovery tool. The more it lets you do with your data, the better. You should be able to investigate your data more closely and along more dimensions, which will ultimately lead to a better understanding of the data.
3. Integrates well with my big data: Yes, first things first, you need a tool that at least has the capability to talk to your data. It should mingle well with your data layouts and structures without requiring too many time-consuming steps. A good tool will make integrating your data almost seamless. If you have to jump through hoops and cut corners to make data integration happen, maybe you are looking at the wrong tool. So, get your data integration team to work and make sure data integration is not an issue with the tool you are evaluating.
4. Friendly with outside data I might include as well: Many times, it is not only about your data. Sometimes you need to access and evaluate external data and find its relationship with your own. Those use cases must be checked as well: how easy is it to include external structured and unstructured data? The bigger the vendor's product integration roadmap, the easier it will be for the tool to connect with other external resources. Your preferred tool should integrate seamlessly with the data sets common in your industry; social data, industry data and other third-party application data are some examples. So, ask your vendor how their tool mingles with outside data sets.
5. Scalability of the platform: Sure, the tool you are evaluating could do wonders with data and has a sweet feature set, but will it scale well as you grow? This is an important consideration, just as with any good corporate tool. If your business grows, so will its data and associated dependencies, but will your discovery tool grow with it? This finding must be part of your evaluation score for any tool you are planning to recruit for your big-data discovery needs. So, get on a call with the vendor's technical teams and grill them to understand how their tool will handle growing data. You don't want to partner with a tool that will break in the future as your business grows.
6. Vendor's vision is in line with our vision: The above five measures are pretty much standard and define the basic functionality a good tool should entail. It's also no big surprise that most tools will have their own interpretation of some or all of the above five points. One key thing to notice on the strategic front is the vendor's vision for the company and the tool. A tool can do you good today: it has a boatload of features, and it is friendly with your data and outside data. But will it grow with a strategy consistent with yours? Yes, no matter how odd it sounds, this is one of the realities you should consider. A vendor handling only health care will have some impact on companies using the tool for the insurance sector. A tool that handles only the clever visualization piece might have an impact on companies expecting some automation as part of the core tool's evolution. So, it is important to understand the tool company's product vision; that will help you understand whether it will still fit your business tomorrow, the day after, or in the foreseeable future.
7. Awesome import/export tools to keep my data/analysis free: Another important thing to note is stickiness. A good product should not keep customers sticky by holding their data hostage; it should bank on its features, usability and data-driven design. So, data and its derived knowledge should be easily importable/exportable in the most common standards (CSV, XML etc.). This keeps the tool open to integration with third-party services that emerge with the evolving market. It will also play an instrumental role in moving your data around as you start dealing with new formats and new reporting tools that leverage your data discovery findings.
I am certain that by the end of these seven points you will have thought of several more criteria one could keep in mind before picking a good data discovery tool. Feel free to email me your findings and I will keep adding them to the list.
Creating charts and info graphics can be time-consuming. But these tools make it easier.
It’s often said that data is the new world currency, and the web is the exchange bureau through which it’s traded. As consumers, we’re positively swimming in data; it’s everywhere, from labels on food packaging to World Health Organisation reports. As a result, for the designer it’s becoming increasingly difficult to present data in a way that stands out from the mass of competing data streams.
One of the best ways to get your message across is to use a visualization to quickly draw attention to the key messages, and by presenting data visually it’s also possible to uncover surprising patterns and observations that wouldn’t be apparent from looking at stats alone.
As author, data journalist and information designer David McCandless said in his TED talk: “By visualizing information, we turn it into a landscape that you can explore with your eyes, a sort of information map. And when you’re lost in information, an information map is kind of useful.”
There are many different ways of telling a story, but everything starts with an idea. So to help you get started we’ve rounded up some of the most awesome data visualization tools available on the web.
If you’re looking for a data viz tool with mapping, InstantAtlas is worth checking out. This tool enables you to create highly interactive dynamic reports and profiles that combine statistics and map data into engaging data visualizations.
Timeline is a fantastic widget which renders a beautiful interactive timeline that responds to the user’s mouse, making it easy to create advanced timelines that convey a lot of information in a compressed space.
Each element can be clicked to reveal more in-depth information, making this a great way to give a big-picture view while still providing full detail.
Developed by MIT, and fully open source, Exhibit makes it easy to create interactive maps and other data-based visualizations that are orientated towards teaching or static/historical data sets, such as flags pinned to countries, or birthplaces of famous people.
Modest Maps is a lightweight, simple mapping tool for web designers that makes it easy to integrate and develop interactive maps within your site, using them as a data visualization tool.
The API is easy to get to grips with, and offers a useful number of hooks for adding your own interaction code, making it a good choice for designers looking to fully customise their user’s experience to match their website or web app. The basic library can also be extended with additional plugins, adding to its core functionality and offering some very useful data integration options.
Another mapping tool, Leaflet makes it easy to use OpenStreetMap data and integrate fully interactive data visualisation in an HTML5/CSS3 wrapper.
The core library itself is very small, but there are a wide range of plugins available that extend the functionality with specialist functionality such as animated markers, masks and heatmaps. Perfect for any project where you need to show data overlaid on a geographical projection (including unusual projections!).
Billed as a “computational knowledge engine”, the Google rival WolframAlpha is really good at intelligently displaying charts in response to data queries without the need for any configuration. If you’re using publicly available data, it offers a simple widget builder to make it really simple to get visualizations on your site.
Visual.ly is a combined gallery and infographic generation tool. It offers a simple toolset for building stunning data representations, as well as a platform to share your creations. This goes beyond pure data visualisation, but if you want to create something that stands on its own, it’s a fantastic resource and an info-junkie’s dream come true!
Visualize Free is a hosted tool that allows you to use publicly available datasets, or upload your own, and build interactive visualizations to illustrate the data. The visualizations go well beyond simple charts, and the service is completely free. While development work requires Flash, output can be done through HTML5.
Orientated towards making positive change to the world, Better World Flux has some lovely visualizations of some pretty depressing data. It would be very useful, for example, if you were writing an article about world poverty, child undernourishment or access to clean water. This tool doesn’t allow you to upload your own data, but does offer a rich interactive output.
Another jQuery plugin, jqPlot is a nice solution for line and point charts. It comes with a few nice additional features such as the ability to generate trend lines automatically, and interactive points that can be adjusted by the website visitor, updating the dataset accordingly.
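jqPlot generates those trend lines for you, but the underlying idea is ordinary least-squares fitting, which is simple enough to check by hand. The sketch below is a plain-Python illustration of the math, not jqPlot's actual JavaScript implementation:

```python
def trend_line(points):
    """Ordinary least-squares fit: returns (slope, intercept) for y = m*x + b."""
    n = len(points)
    sum_x = sum(x for x, _ in points)
    sum_y = sum(y for _, y in points)
    sum_xx = sum(x * x for x, _ in points)
    sum_xy = sum(x * y for x, y in points)
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
    b = (sum_y - m * sum_x) / n
    return m, b

# Points lying exactly on y = 2x + 1, so the fit recovers that line.
slope, intercept = trend_line([(0, 1), (1, 3), (2, 5), (3, 7)])
print(slope, intercept)  # → 2.0 1.0
```

Whatever charting library you end up with, running a quick check like this against known data is a good way to confirm you understand what its automatic trend lines are showing.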
Dipity allows you to create rich interactive timelines and embed them on your website. It offers a free version and a premium product, with the usual restrictions and limitations present. The timelines it outputs are beautiful and fully customisable, and are very easy to embed directly into your page.
Developed by IBM, Many Eyes allows you to quickly build visualizations from publicly available or uploaded data sets, and features a wide range of analysis types including the ability to scan text for keyword density and saturation. This is another great example of a big company supporting research and sharing the results openly.
If you need to generate charts and graphs server-side, jpGraph offers a PHP-based solution with a wide range of chart types. It’s free for non-commercial use, and features extensive documentation. By rendering on the server, this is guaranteed to provide a consistent visual output, albeit at the expense of interactivity and accessibility.
The seminal charting solution for much of the web, Google Charts is highly flexible and has an excellent set of developer tools behind it. It’s an especially useful tool for specialist visualizations such as geocharts and gauges, and it also includes built-in animation and user interaction controls.
You can actually do some pretty complex things with Excel, from ‘heat maps’ of cells to scatter plots. As an entry-level tool, it can be a good way of quickly exploring data, or creating visualizations for internal use, but the limited default set of colours, lines and styles make it difficult to create graphics that would be usable in a professional publication or website. Nevertheless, as a means of rapidly communicating ideas, Excel should be part of your toolbox.
Excel comes as part of the commercial Microsoft Office suite, so if you don’t have access to it, Google’s spreadsheets (part of Google Docs and Google Drive) can do many of the same things. Google ‘eats its own dog food’, so the spreadsheet can generate the same charts as the Google Chart API. This will get you familiar with what is possible before stepping off and using the API directly for your own projects.
Dragging on variables enables you to increase or decrease their values and see an accompanying chart update automatically. The results are only just short of magical.
Polymaps is a mapping library that is aimed squarely at a data visualization audience. Offering a unique approach to styling the maps it creates, analogous to CSS selectors, it’s a great resource to know about.
OpenLayers is probably the most robust of these mapping libraries. The documentation isn’t great and the learning curve is steep, but for certain tasks nothing else can compete. When you need a very specific tool no other library provides, OpenLayers is always there.
Kartograph’s tag line is ‘rethink mapping’ and that is exactly what its developers are doing. We’re all used to the Mercator projection, but Kartograph brings far more choices to the table. If you aren’t working with worldwide data, and can place your map in a defined box, Kartograph has the options you need to stand out from the crowd.
CartoDB is a must-know site. The ease with which you can combine tabular data with maps is second to none. For example, you can feed in a CSV file of address strings and it will convert them to latitudes and longitudes and plot them on a map, but there are many other uses. It’s free for up to five tables; after that, there are monthly pricing plans.
Processing has become the poster child for interactive visualizations. It enables you to write much simpler code which is in turn compiled into Java.
There is also a Processing.js project to make it easier for websites to use Processing without Java applets, plus a port to Objective-C so you can use it on iOS. It is a desktop application, but can be run on all platforms, and given that it is now several years old, there are plenty of examples and code from the community.
NodeBox is an OS X application for creating 2D graphics and visualizations. You need to know and understand Python code, but beyond that it’s a quick and easy way to tweak variables and see results instantly. It’s similar to Processing, but without all the interactivity.
How many other pieces of software have an entire search engine dedicated to them? A statistical package used to parse large data sets, R is a very complex tool that takes a while to understand, but it has a strong community and package library, with more being produced all the time.
The learning curve is one of the steepest of any tool listed here, but you will need to be comfortable with it if you want to work at this level.
When you get deeper into being a data scientist, you will need to expand your capabilities from just creating visualizations to data mining. Weka is a good tool for classifying and clustering data based on various attributes – both powerful ways to explore data – but it also has the ability to generate simple plots.
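Weka's clusterers are point-and-click, but the core idea behind its simplest one (k-means) is compact enough to sketch. This is an illustrative toy in plain Python, not Weka's actual implementation, and it works on one-dimensional data with a hand-picked k:

```python
def kmeans_1d(values, k, iters=10):
    """Toy k-means on 1-D data: returns the final cluster centres."""
    centres = sorted(values)[:k]  # naive initialisation: the k smallest values
    for _ in range(iters):
        # Assignment step: attach each value to its nearest centre.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centres[i]))
            clusters[nearest].append(v)
        # Update step: move each centre to its cluster's mean
        # (keep the old centre if a cluster ends up empty).
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return centres

# Two obvious groups, around 1 and around 10.
print(kmeans_1d([1.0, 1.2, 0.8, 9.9, 10.1, 10.0], k=2))
```

Real tools like Weka handle multi-dimensional attributes, smarter initialisation and stopping criteria, but the assign-then-update loop above is the heart of the technique.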
When people talk about relatedness, social graphs and co-relations, they are really talking about how two nodes are related to one another relative to the other nodes in a network. The nodes in question could be people in a company, words in a document or passes in a football game, but the maths is the same.
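Whatever the domain, that notion of relatedness starts from an adjacency structure plus a similarity measure. One common measure is counting shared neighbours; the sketch below does this for a small, made-up undirected graph in plain Python:

```python
def shared_neighbours(graph, a, b):
    """Count the neighbours two nodes have in common in an undirected graph,
    a simple proxy for how related they are within the network."""
    return len(set(graph[a]) & set(graph[b]))

# A small, hypothetical company network: who works with whom.
graph = {
    "ana": ["ben", "cho", "dev"],
    "ben": ["ana", "cho"],
    "cho": ["ana", "ben", "dev"],
    "dev": ["ana", "cho"],
}

print(shared_neighbours(graph, "ben", "dev"))  # ben and dev share ana and cho → 2
```

Swap the people for words or football passes and the same structure and arithmetic apply, which is exactly why one graph tool can serve all of these domains.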
Gephi, a graph-based visualiser and data explorer, can not only crunch large data sets and produce beautiful visualizations, but also allows you to clean and sort the data. It’s a very niche use case and a complex piece of software, but it puts you ahead of anyone else in the field who doesn’t know about this gem.
The iCharts service provides a hosted solution for creating and presenting compelling charts for inclusion on your website. There are many different chart types available, and each is fully customisable to suit the subject matter and colour scheme of your site.
Charts can have interactive elements, and can pull data from Google Docs, Excel spreadsheets and other sources. The free account lets you create basic charts, while you can pay to upgrade for additional features and branding-free options.
Flot is a specialised plotting library for jQuery, but it has many handy features and crucially works across all common browsers including Internet Explorer 6. Data can be animated and, because it’s a jQuery plugin, you can fully control all the aspects of animation, presentation and user interaction. This does mean that you need to be familiar with (and comfortable with) jQuery, but if that’s the case, this makes a great option for including interactive charts on your website.
That said, it’s a bit more hands-on than some of the other tools featured here (a victim of being so flexible), so unless you’re a hardcore coder, you might want to check out some of the more point-and-click orientated options first!
Written by the team behind jQuery’s ThemeRoller and jQuery UI websites, jQuery Visualize Plugin is an open source charting plugin for jQuery that uses HTML Canvas to draw a number of different chart types. One of the key features of this plugin is its focus on achieving ARIA support, making it friendly to screen-readers. It’s free to download from this page on GitHub.
About the speaker:
Sam Selikoff [ http://www.samselikoff.com/ | @samselikoff ] is a self-taught full-stack web developer. Formerly a graduate student of economics and finance, he unexpectedly discovered a passion for programming while doing data work for a consulting firm. He is currently focusing on client-side MVC and data visualization.
The $70-billion Indian IT export services industry, which recorded a 12.6 per cent year-on-year growth in FY 2015 and employs 1.2 million people, is all set to witness shrinkage in jobs over the next three years.
Shift in demand
While demand for traditional IT services will dip, demand for niche skills in next-generation technologies such as Cloud, Social, Mobility, Big Data, Analytics and Internet of Things (IoT) will result in a war for talent.
“Over a three-year horizon the total number of jobs in IT services will witness a 10 per cent shrinkage as the legacy business continues to evaporate and next-generation technology services are in demand by global customers,” Jaideep Mehta, Managing Director, IDC India and South Asia, told BusinessLine.
“Low-end, low-skilled software developers and infrastructure maintenance engineers will become redundant. While IT services firms will endeavour to re-skill or re-shuffle this low-end talent and re-deploy them, those who are unable to transition into newer technologies will no longer be relevant.
“New skill sets will be in demand, such as Cloud architects, security specialists, IT administrators, and Analytics and IoT experts, resulting in a war for the same talent,” Mehta predicts.
Concurring with Mehta, Rishi Das, CEO of CareerNet Consulting, said: “As systems become more intelligent with automation, artificial intelligence and big data analysis, IT productivity will improve by 10-20 percentage points and manpower addition will shrink by about 8-10 per cent over the next three years. For instance, customer support will move to a self-service model enabled via mobiles; coding jobs will also shrink as a lot of domain-specific software will be available on the Cloud, deployed and used with a little bit of customisation.”
K Harishankaran, co-founder of HackerRank, which has a database of 1 million Indian and global programmers on its platform, is also of the opinion that when hiring stabilises over the next 3-5 years, it could lead to a 10-15 per cent shrinkage in recruitment.
He points out that currently, demand for quality talent in engineering, product development, DevOps, Big Data, Analytics and Mobile apps development especially for iOS and Android apps, is on the rise but difficult to find.
According to Nasscom data, the Indian IT & BPO industry added 200,000 jobs in FY 2015, of which 80 per cent were in traditional services.
Sangeeta Gupta, Senior Vice-President of Nasscom, admitted that the rate of job creation in traditional IT services will shrink as the industry gradually moves toward automation and digital technologies.
This will be offset by brisk hiring by product companies, start-ups, consumer internet firms and even large brick-and-mortar retail chains, which are all looking to invest in providing online and mobile access, she said.
To read the original article on The Hindu Business Line, click here.
Small businesses are thinking bigger about their data, and it's about time.
The term big data sounds intimidating, as if reserved only for the Fortune 500 leaders, but that could not be further from the reality of data analytics in today's competitive small business market.
Previously the exclusive domain of statisticians, large corporations and information technology departments, the emerging availability of data and analytics (call it a new democratization) gives small businesses and consumers greater access to cost-effective, sophisticated, data-powered tools and analytical systems.
For small businesses, big data will deliver meaningful insights on markets, competition and bottom-line business results.
For small businesses and consumers, the big data revolution promises a wide range of benefits.
New Tech, New Rules
Today, big data is changing the rules of commerce and business operations, creating opportunities and challenges for small businesses. The convergence of three leading computing trends (cloud technologies, mobile technologies and social media) is creating cost-effective, data-rich platforms on which to build new businesses and drive economic growth for small and large businesses alike. This helps boost local economies as well as global e-commerce and trade.
Digital data will continue to turbocharge the movement to understand analytics, in both small and large businesses. Proprietary data combined with data from the cloud will continue to create new insights and a deeper understanding of what consumers need, what they like and what will keep them happy.
The development of new data sources and unique analytics will drive entrepreneurial growth around the globe over the coming decade.
Today, small businesses can leverage business management solutions, including Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) software platforms, to automate operational management tasks and keep better watch over their very own big data, including analytical views of sales and marketing campaigns.
Small businesses can stay on top of accounting, cash flow, budgets, balances and more with financial management software alternatives, as well as tools and applications for inventory management, project management, fleet management, human resources and more.
By optimizing real-time data analytics, small businesses today are capturing a better view of their administrative, sales and marketing practices, including real-time overviews of what's working well and what needs scrutiny. Small businesses mining their own big data today routinely deploy a variety of solutions, most originating in the cloud, to improve operational and administrative efficiency and productivity, while reducing manual tasks and redundancies.
Today, small businesses are no longer intimidated by big data. They are embracing it to create and manage bigger opportunities for growth and profitability.
Today's competitive small businesses realize that optimizing analytics and business intelligence allows them to recognize the full benefits of their very own big data, powering better marketing, sales and operational efficiency, productivity and functional gains. With data-driven tasks and decisions in the mix, a new culture of small business is emerging, powering greater opportunities for the small business community, its vendors and customers.
Angela Nadeau is CEO of CompuData, an award-winning business technologies leader. Angela maintains a deep knowledge of the trends driving businesses today to be more productive and profitable by leveraging technology. With more than 25 years of expertise, she has advised thousands of businesses on effective ways to leverage technology to increase productivity, profitability and efficiency, guiding businesses of all sizes to new levels of market success and corporate growth.
Customer Experience Management (CEM) programs are complex, data-intensive programs. To be successful, you need to effectively communicate information about the program to important stakeholders, including employees, partners, and customers. We know that loyalty-leading companies communicate customer initiatives throughout the company, from top executives to front-line employees. For example, in a best practices study on customer feedback programs, I found that loyalty-leading companies, compared to loyalty-lagging companies, communicate a lot about their CEM program, including its goals, processes and results.
Communications about the CEM program need to be tailored to each audience's needs. Communication of CEM elements to other stakeholders (employees, partners and customers) looks quite different because their needs are different. To build a company around the customer, communication to these constituencies needs to convey two things: 1) what is important to customers and 2) what you are doing to improve the customer experience.
1. What Matters to the Customers
To build a company around the customer and their needs, each employee and partner needs to know which customer touch points are important. Knowing what is important to customers, employees are better equipped to deliver a customer experience that meets or exceeds customers' expectations. Senior executives, for example, receive the results of driver analysis, which helps them in strategic decision making. Front-line employees could receive individual customer results as well as aggregated customer results. This micro and macro level of reporting ensures employees understand specific customer concerns as well as customer concerns on a wider scale.
Be sure to make CEM results interesting. Bar charts are boring. I know because I make them. Consider using word clouds to summarize customers' open-ended comments. Word clouds let you present frequency data in a fun, interesting way. Depending on how the customer survey question is phrased, the resulting word cloud will have a different meaning. For example, a word cloud based on the question "What improvements would you make to the products/services?" would help employees identify the key business areas that are important to customers. A word cloud based on the question "What would you say to somebody considering purchasing our products/solutions?" would help employees understand how to better market and sell solutions. Check out these examples of word clouds you could use as a starting point for your CEM program.
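A word cloud is just word-frequency data with a visual skin, so the underlying counts from open-ended comments are easy to generate. Here is a minimal sketch in plain Python; the sample comments and the stop-word list are made up for illustration:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "to", "a", "is", "and", "it", "but", "too"}  # illustrative only

def word_frequencies(comments):
    """Tokenise free-text survey comments and count non-stop-word frequencies."""
    counts = Counter()
    for comment in comments:
        for word in re.findall(r"[a-z']+", comment.lower()):
            if word not in STOP_WORDS:
                counts[word] += 1
    return counts

comments = [
    "Support response time is too slow",
    "Slow shipping, but great support team",
    "Great product, support could be faster",
]
print(word_frequencies(comments).most_common(3))  # 'support' tops the list
```

Feed counts like these into any word cloud generator and the biggest words will be the themes your customers mention most; a real program would use a proper stop-word list and stemming.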
2. What You are Doing to Improve the Customer Experience
Companies collect customer feedback in order to make customer experience improvements in areas that matter to customers. Oftentimes, these improvements come in the form of process and operational changes that impact a large group of customers. Keep track of these operational changes and share them with different constituencies. For each change, record the reason behind it (driven by the company or driven by customer feedback) to optimize its use as an example in marketing collateral or customer communications.
The list of changes reflects specific systemic changes designed to improve the customer experience. When creating this list, be specific in your descriptions. Don't just say, "Improved marketing process." Instead, a more precise approach includes the specific problem being addressed, the expected/actual change (if you have data) and the specific customer lifecycle phase that was impacted ("Improved email communications process to ensure customers are not receiving unwanted communications."). To generate this list, you can solicit customer experience improvement examples from your employees or established improvement teams. Keep a list of these accomplishments/changes in a central location to facilitate their dissemination throughout the company.
Benefits of Sharing
You will receive many diverse benefits from sharing elements of the customer experience management program. Here are four benefits:
Improve survey response rates. Is there anything more frustrating to a customer than not being listened to? Given that customers use their valuable time to provide you feedback to improve your processes, the least you can do is let them know you are listening. In your next feedback invitation letter, include a list of the specific things your company is doing differently as a result of customer feedback; about three to five of these accomplishments is enough. Sharing how their feedback impacts the way you run your business shows customers that providing feedback is not a waste of their time; their words result in real and meaningful operational changes.
Improve marketing and sales approach. The list of accomplishments, as well as the word clouds, when shared with and digested by the marketing and sales teams, can improve marketing and sales communications and collateral (especially if responses, in general, reflect high praise).
Brand your CEM program. Customer-centric changes as well as word clouds could be used for branding your CEM program internally, and can be included in annual reports when discussing your overall CEM program with employees and company shareholders.
Build customer-centric culture. Sharing customer feedback results across all levels of the company helps build a strong customer-centric culture. It is important to keep customers and their needs in the minds of employees. Aggregating and disseminating customer feedback in the form of artful word clouds can go a long way toward accomplishing this goal. Additionally, listing the ways your company has changed as a direct result of customer feedback shows employees that senior management is serious about using customer feedback in process improvement efforts. Including these success stories on the company intranet site (e.g., employee portal) can provide the impetus for other employees to think of ways of improving the customer experience.
Loyalty-leading companies share all aspects of their CEM program with different constituencies. To have a successful CEM program, communicate different aspects of it to different stakeholders. Two important elements to communicate are: 1) knowledge of customers' needs and 2) the changes you are making to improve the customer experience. The value of these two types of information shows up as improvements in marketing, sales and support, as well as in the CEM program itself through increased customer feedback.
Let customers know you are listening by simply telling them how their feedback is helping you make changes that improve the customer experience. Employees need to know the importance of the customer and their needs; sharing important customer themes across all levels of the company helps employees understand what matters to customers and helps set expectations about their performance. Don't do a data dump on the consumers of your CEM data; use creative ways to convey information that is both truthful and compelling.
No, we are not talking about the overall mobile strategy for retailers but a sub-part of it: the mobile strategy for brick-and-mortar stores. Yes, it is different from the retailer's overall mobile strategy, and no, it cannot be done correctly without thinking differently from the online-store strategy. With the ever-increasing number of smartphones and affordable data plans, it would be a bad move not to think about a mobile strategy. I hope retailers are already aware that having one is critical for their business; there is plenty of material already written explaining why. What I have yet to discover is a separate mobile strategy that helps retailers with their brick-and-mortar stores.
For number enthusiasts: according to comScore, as of January 2012, 101.3 million people in the United States had a smartphone. That's almost one in every three Americans! Nielsen puts this number at around 43%, and Fitch Ratings predicts that by the end of 2012 roughly two-thirds (approximately 66%) of the US population will be using smartphones. The U.S. Census Bureau recently announced that eCommerce was responsible for $48.2 billion in sales during the third quarter of 2011.
Market research by Vibes shows that mobile technology plays a vital role for in-store shoppers:
84% of shoppers have conducted in-store product research via smartphone
Nearly half of all consumers feel more confident about their purchasing decisions after pulling up additional product information on their mobile phones
33% admitted to searching a competitor’s website for better deals while in-store
6% of consumers said they were likely to abandon an in-store purchase for a competing offer
Interestingly, many brick-and-mortar stores are not doing great despite investing in a good mobile marketing strategy. The reason is the one-size-fits-all approach used by retailers: many have just one app that handles the entire retail experience, the online presence as well as the brick-and-mortar stores. Retailers should refocus their mobile strategy and break the overall plan into two parts: an online mobile strategy and a brick-and-mortar store strategy. Each has its own focus; one primarily caters to online mobile surfers, while the other caters to visitors seeking help in the physical store. No, it is not necessary to design these as two different apps. They could very well be integrated into one, but the design and feature set should specifically keep the needs of store visitors in mind, and when both live in the same app, the application framework should be smart enough to identify which kind of traffic it is serving. As a starter, it would be a good idea to experiment and test this using a QR-code-backed website, which could later be folded into the app strategy once the workflows and use cases are identified and validated.
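One lightweight way for a single app or website to distinguish in-store traffic from online traffic is to tag the URLs encoded in the in-store QR codes with a source parameter. A hedged stdlib sketch (the domain, parameter names, and store/SKU identifiers are all invented for illustration):

```python
from urllib.parse import urlencode, urlparse, parse_qs

BASE = "https://example-retailer.com/app"  # hypothetical landing page

def qr_landing_url(store_id, product_sku):
    """Build the URL that gets encoded into an in-store QR code."""
    params = {"source": "qr", "store": store_id, "sku": product_sku}
    return f"{BASE}?{urlencode(params)}"

def is_in_store(url):
    """Server/app side: route tagged visitors to store-specific workflows."""
    query = parse_qs(urlparse(url).query)
    return query.get("source") == ["qr"]

url = qr_landing_url("NYC-042", "SKU-12345")
print(url, is_in_store(url))
```

The same tagging idea carries over unchanged when the QR-code experiment graduates into the full app: the app simply reads the source parameter from its deep links.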
The following design considerations will go a long way toward building a strong brick-and-mortar-specific mobile strategy:
1. Include all possible use cases needed by store visitors and wanderers:
It is important to understand the most promising features required by users who are browsing the store or wandering around its aisles. Use cases may include learning more about products, asking for help, searching for accessories, price matching, etc. Hiring some shoppers or running focus groups could provide a starting ground. In this data-savvy age I am generally against focus groups, but they can certainly work as a starting point. It is also important to restrict the research and findings to areas that impact store workflows only.
2. Connect the online store with the offline store through a seamless layer:
Considering the expanding mobile landscape, it is important not to lose sight of the bigger picture and the overall mobile strategy. Seamless connectivity between store-specific workflows and online-store workflows gives users easy maneuverability. The goal should be to keep users satisfied by fulfilling all their needs, thereby keeping their business within the store's commerce channels. Examples include showing online product availability from within the store, or shipping an out-of-stock item home for free from the online store. In short, compensate for the shortfalls of one channel with the other, online and brick-and-mortar alike.
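The "compensate across channels" rule can be captured in a few lines. A toy sketch with an in-memory inventory (a real app would query the retailer's inventory service; the SKUs and messages here are invented):

```python
# Toy in-memory online inventory; a real app would call an inventory API.
ONLINE_STOCK = {"SKU-12345": 7, "SKU-67890": 0}

def cross_channel_offer(sku, in_store_qty):
    """If the store shelf is empty, offer free home shipping from online stock."""
    if in_store_qty > 0:
        return "available in store"
    if ONLINE_STOCK.get(sku, 0) > 0:
        return "ships free from online store"
    return "out of stock everywhere"

print(cross_channel_offer("SKU-12345", 0))  # online stock covers the shortfall
print(cross_channel_offer("SKU-67890", 0))
```

The design point is that the fallback logic lives in one place, so the in-store and online workflows present a single, consistent answer to the shopper.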
3. Provide the ability to leave feedback, suggestions, grievances, etc.:
Learning is an important part of any business, and with the evolution of data tools there is no excuse not to leverage them. Any possible auto-learning opportunity should be incorporated. A good customer experience management strategy provides the list of surveys and learning prompts that should be enabled in the mobile framework at the appropriate workflow touch points. Having done that, retailers will need little more than this self-learning mechanism to evolve with changing market dynamics, which can lead to sustainable business growth.
4. Reward visitors for increased usage:
Increased usage gives stores many other opportunities, such as better learning and better chances of referrals and recommendations. With that in mind, retailers should provision some reward system to encourage the use of their mobile products, and a good design framework should provide the mechanics for integrating one. This could be done through store credits, coupons, etc. It is also important to learn which rewards work at which stage.
5. Provide seamless presence and connectivity with other social platforms:
It is no surprise that users already rely on many established, reliable local-presence social apps, such as Facebook, Foursquare and Yelp. It is important for a store's mobile strategy to incorporate some alliance with those platforms as well, and the store's presence there should be customized and tailored to attract visitors. The sooner stores get on board with those platforms, the more adoption they will receive.
So, get the right gear and move on to building a robust brick-and-mortar store mobile strategy that helps stores learn faster and move with the changing customer landscape.
At The Data Incubator, we run a free six-week data science fellowship to help our Fellows land industry jobs. Our hiring partners love considering Fellows who don't mind getting their hands dirty with data. That's why our Fellows work on cool capstone projects that showcase those skills. One of the biggest obstacles to successful projects has been getting access to interesting data. Here are a few cool public data sources you can use for your next project:
Publicly Traded Market Data: Quandl is an amazing source of finance data. Google Finance and Yahoo Finance are additional good sources of data. Corporate filings with the SEC are available on EDGAR.
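Sources like Quandl and Yahoo Finance typically serve daily prices as CSV. A minimal stdlib sketch that parses such a file and computes day-over-day returns (the sample rows are invented, not real quotes):

```python
import csv
import io

# Invented sample in the Date,Close shape common to daily-price CSV downloads.
SAMPLE_CSV = """Date,Close
2012-01-03,100.0
2012-01-04,102.0
2012-01-05,99.96
"""

def daily_returns(csv_text):
    """Parse Date,Close rows and compute simple day-over-day returns."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    closes = [float(row["Close"]) for row in rows]
    return [(b - a) / a for a, b in zip(closes, closes[1:])]

returns = daily_returns(SAMPLE_CSV)
print(returns)
```

Swapping `SAMPLE_CSV` for the text of a real downloaded file is all it takes to turn this into the first step of a capstone analysis.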
Review Content: You can get reviews of restaurants and physical venues from Foursquare and Yelp (see geodata). Amazon has a large repository of Product Reviews. Beer reviews are available from Beer Advocate. Rotten Tomatoes Movie Reviews are available from Kaggle.
While building your own project cannot replicate the experience of the fellowship at The Data Incubator (our Fellows get amazing access to hiring managers and to nonpublic data sources), we hope this will get you excited about working in data science. And when you are ready, you can apply to be a Fellow!
Got any more data sources? Let us know and we'll add them to the list!