Video Analytics Solutions: A Boon During Covid


Take a look at some of the use cases to understand the extent of the positive impact made by video analytics

Cutting-edge technologies serve as go-to solutions during major disruptions and challenges, and the recent pandemic is no exception. Whether during the peak of a wave or as a measure to safely and efficiently resume operations, organizations all over the world are relying on remote work and monitoring technologies such as AI video analytics. Despite being a comparatively new technology, AI video analytics has opened a vista of nearly limitless possibilities across verticals and use cases. These solutions now enable compliance with standards and operational efficiency in a way that was not humanly possible before. Conventionally, CCTVs work as eyes that see everything and store the visuals for later analysis or reference, or simply to be deleted after a specific duration. Only when a crime, an unfortunate event, or a violation of law takes place is the stored footage used for investigation. Under normal circumstances, these humongous data volumes go unused.

Contact tracing

In the current scenario, this is one of the major ways in which the number of infections can be monitored and new ones prevented. This is where video analytics can make a breakthrough by enabling constant tracking of infected individuals through facial recognition and other aspects. When there is complete visibility of movement and actions, it will become easier to identify risks and prevent outbreaks.  

COVID protocol adherence

As workplaces reopen, there is still a need for continuous monitoring of, and adherence to, certain hygiene and social protocols such as social distancing, preventing indoor crowds, and monitoring employees for symptoms of illness. Video analytics tools can monitor hygiene across the premises and alert housekeeping teams if a need for preventive cleaning is identified.

Managing occupancy

In times of crisis like this, video analytics can empower organizations, both healthcare and otherwise, to continuously and effectively monitor occupancy. Be it the number of visitors or individuals in specific areas, the technology can enable monitoring of occupancy and ensure adherence to prescribed norms. Beyond this, video analytics capabilities are helping in various other aspects of business as well. Law enforcement agencies use cameras with integrated facial recognition to identify suspects, and healthcare providers are using the technology for contact tracing of COVID-19 patients.

Avoiding crowds

One of the major reasons for a surge in infections is crowds, in any place. Video analytics can enable the identification of hotspots and the reduction of crowding, especially in places where there is an absolute need to do so. For example, the technology can determine the ideal number of occupants according to the area and its capacity. When the crowd exceeds the permitted number, it can send a warning or alert to ensure there is no crowding. When deployed effectively, modern video analytics solutions can resolve most of the challenges of public and business management. Today, they are helping in the continuation of commercial as well as social lives and preventing the Omicron wave from going out of control. In the future, the technology will be instrumental in preventing any other waves of COVID-19 or other contagions in India!
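The alerting logic described above can be sketched in a few lines. This is a minimal illustration, not any specific product's implementation; the capacity figure, the 50% occupancy ratio, and the function names are assumptions for the example.

```python
# Minimal sketch of the occupancy-alert idea: given per-frame people
# counts from a video analytics pipeline, flag the frames where
# occupancy exceeds an allowed fraction of the area's capacity.

def occupancy_alerts(frame_counts, capacity, max_ratio=0.5):
    """Return the indices of frames whose count exceeds the limit."""
    limit = capacity * max_ratio
    return [i for i, count in enumerate(frame_counts) if count > limit]

# Example: room capacity 40, allowed occupancy 50% (20 people).
alerts = occupancy_alerts([12, 18, 25, 31, 19], capacity=40)
print(alerts)  # frames 2 and 3 exceed the limit
```

In a real deployment the counts would come from a people-detection model and the alert would feed a notification system rather than a print statement.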



Machine Vision – Now A Boon For Manufacturing Companies

What is Machine Vision? A Shallow Dive Into Deep Learning

I’m sure you have come across the term ‘Deep Learning’ at one time or another, so let’s have a quick recap of what exactly deep learning is. Tech giants like Google, Apple, Facebook, and even Toyota have invested in deep learning technology, yet it is still hard to find an explanation of exactly what deep learning is, especially in simple language that everybody can understand. Deep learning is a sub-branch of machine learning concerned with neural networks. Neural networks are inspired by our very own brains: a network is an interconnected web of nodes (neurons). These networks receive a set of inputs, perform progressively complex calculations, and eventually use the outputs to solve a problem. Machine vision software, when enabled with deep learning algorithms, can achieve high rates of visual identification and can consequently be used for fault inspection, product-piece positioning, and automated handling of products.

How is This Any Different from Conventional Machine Vision?

In conventional machine vision technology, developers need to manually define and verify individual features for the machine in the software. However, due to the cognitive nature of deep learning algorithms, the machines can automatically find and extract patterns in order to differentiate between meticulously detailed and larger components being produced. Machine vision software usually makes use of supervised deep learning algorithms to train the machine. A sample training dataset (with labels) is provided for the machine to learn from, identify, and distinguish between objects. The system analyses this data and creates training models corresponding to the objects identified. Once a fresh set of testing data (without labels) is fed into the system, the deep learning algorithm is able to assign a class/label to the unlabelled data. Thanks to this allocation of classes, items can now be identified automatically. The machine vision software continues to learn and relearn on the go; deep learning processes are able to learn new things independently, without manual classification.
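The train-on-labeled, predict-on-unlabeled workflow described above can be illustrated with a toy classifier. A nearest-centroid rule stands in for the deep network here purely for clarity; the feature values, labels, and function names are invented for the example, and real machine vision software learns far richer features.

```python
# Toy supervised workflow: labeled samples train a model, which then
# assigns classes to unlabeled samples.

def train_centroids(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the label of the closest centroid (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Labeled training data: e.g. (width, height) measurements per part type.
training = [([1.0, 1.1], "ok"), ([0.9, 1.0], "ok"),
            ([2.0, 0.2], "defect"), ([2.1, 0.3], "defect")]
model = train_centroids(training)
print(predict(model, [1.05, 0.95]))  # -> ok
print(predict(model, [1.9, 0.25]))   # -> defect
```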

Expert Talk

HALCON is premium machine vision software with an integrated development environment that is used worldwide. Programming work can be arduous and time-consuming for pure manufacturing companies, since highly skilled deep learning developers are required to build and train machine vision software. MVTec enables cost savings and improves time efficiency for this market, and HALCON’s flexible architecture facilitates rapid development for any kind of machine vision application. A major application for deep learning in machine vision technology is OCR (Optical Character Recognition), which is used to precisely identify letter and number combinations. With deep learning, the typical features of each character are precisely identified based on defined classes. Identifying defects is a time-consuming affair, especially in the case of tiny scratches on electronic devices. Experts would have to manually feed thousands of images for the machine to catch these tiny scratch defects, which would simply take too long. But deep learning can independently learn the characteristics of defects and sort them into corresponding ‘problem classes’. This can be used to identify small paint defects that are not visible to the naked eye.

Pandas Profiling – A Visual Analytics Wonder

This article was published as a part of the Data Science Blogathon.


The pandas-profiling Python package produces an interactive set of tables and visualizations for exploratory data analysis (EDA). Building these by hand can be difficult; the library lets you create dynamic, interactive collections of EDA tables and visualizations with just one line of code, without needing an in-depth understanding of multiple packages.

Pandas’ df.describe() function is great but a bit rudimentary for serious exploratory data analysis. pandas_profiling extends the pandas DataFrame with df.profile_report() for fast data analysis.

Helps create profile reports for Pandas DataFrames

Exploratory Data Analysis

An exploratory data analysis is a strategy to investigate and analyze data sets to gather knowledge visually. EDA is used to comprehend a dataset’s important aspects.

Values, counts, means, medians, quartiles, data distributions, correlations between variables, data types, and other information are all found with the aid of EDA. Done manually, EDA takes a lot of time and calls for numerous lines of code.
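To see why manual EDA takes so many lines, here is a sketch of just a handful of the figures for a single column, computed in pure Python so the example stands alone. The toy data is invented for illustration; with pandas-profiling, the entire report is one call.

```python
# Even a few basic EDA figures for one column take several distinct
# computations when done by hand.

data = [23, 41, None, 37, 41, None, 19, 41]  # a toy column with gaps

values = [v for v in data if v is not None]
stats = {
    "count": len(values),
    "missing": data.count(None),
    "distinct": len(set(values)),
    "mean": sum(values) / len(values),
    "min": min(values),
    "max": max(values),
    "mode": max(set(values), key=values.count),
}
print(stats)
```

Multiply this by every column, add correlations and distributions, and the appeal of automating EDA becomes obvious.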

Pandas Profiling is a Python package that can be used to automate EDA. It’s a fantastic tool for making interactive HTML reports that simplify data interpretation and analysis.

Installing Pandas Profiling in Different Ways

Let’s explore Pandas Profiling. Using just one line of code, EDA runs very quickly.

Option 1: Using pip

Install pandas-profiling using the pip package installer if you are running Python on its own. Run the following command in cmd (Windows) or the terminal (Mac):

pip install pandas-profiling[notebook]

Pip’s installation covers everything you need. After running pip install, you will see several dependencies installed, such as pandas, notebook, seaborn, etc. Everything required to produce a profile report is included.

Option 2: GitHub

Alternatively, you can download the most recent version straight from GitHub.

Option 3: Using Conda

Install the pandas-profiling library via conda if you use the Anaconda distribution of Python. Run the following command in the Anaconda terminal:

conda install -c conda-forge pandas-profiling

With conda, everything you need is installed.

Option 4: From Source

Clone the repository, or download the source code using the ‘Download ZIP’ button on the repository page.

Go to the directory of your choice and install it by running:

python setup.py install

Note: The pandas-profiling library is built on pandas, so your installed pandas version must match the one pandas-profiling expects. If you have an older version of pandas installed, pandas-profiling may raise an error. As a workaround, update your pandas installation to the newest version. To force the current pandas to update, return to the console and run the following command:

pip install --upgrade --force-reinstall pandas

Now your pandas version is up to date.

A Case Study on Google Colab

Pandas Profiling Reports – “Basic Building blocks.”

To say that generating output with the pandas-profiling library is simple would be an understatement. You can use the following code to construct a general output called a profile report.

To generate a profile report:

Import pandas

Import ProfileReport from pandas_profiling library

Create DataFrame with data for the report

Pass DataFrame using ProfileReport()

Installing the Library – Pandas Profiling

Importing Basic Libraries for Numerical, Visual Data Manipulation

import pandas as pd
import matplotlib.pyplot as plt
from pandas_profiling import ProfileReport
pd.set_option('display.max_colwidth', None)
%matplotlib inline

Reading the Excel Data using pandas

df=pd.read_excel('GA NMU.xlsx')

Why Profiling reports are useful

Utilizing profile.to_notebook_iframe() to render the report in a frame inside the notebook


Sample output with running query

Saving the Output in HTML format

profile.to_file(output_file='Pandas ProfilingReport.html')

For each column, the following statistics – if relevant for the column type – are presented in an interactive HTML report:

Type inference: detect the types of columns in a data frame.

Essentials: type, unique values, missing values

Quantile statistics like minimum value, Q1, median, Q3, maximum, range, interquartile range

Descriptive statistics like mean, mode, standard deviation, sum, median absolute deviation, coefficient of variation, kurtosis, skewness

Most frequent values


Correlations highlighting highly correlated variables, Spearman, Pearson, and Kendall matrices

Missing values matrix, count, heatmap, and dendrogram of missing values

Text analysis learns about text data blocks (ASCII), scripts (Latin, Cyrillic), and categories (Uppercase, Space).

File and Image analysis extract file sizes, creation dates, and dimensions and scan for truncated images or those containing EXIF information.

Several Segments are Available in the Pandas Profiling Report


A portion of the more than five pages of data and visualizations are shown above. This is a rudimentary implementation, as was already stated. The report’s title was the only optional addition (not shown in the image above). The Toggle Details widget is visible. A list of specific details is displayed when a user taps the widget (button).

Overview Section:

General information is provided in this section: variable types and data statistics.

Record statistics display columns, rows, missing values, etc.

The variable type indicates the data type of each record attribute. A “warning” is also displayed, listing features that are strongly correlated with other features.

Variable Section:

Detailed information is provided in this section for each feature individually. When you select the Toggle Details option, as indicated in the aforementioned image, a new section will be displayed.



Correlations:

This section uses a Seaborn heatmap to illustrate how the features are related. You can easily switch between various correlation matrices, including Pearson, Spearman, and Kendall.
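The Pearson coefficient behind that heatmap can be computed directly. This is a minimal pure-Python version for two columns; Spearman applies the same formula to ranks, and Kendall instead counts concordant versus discordant pairs.

```python
# Pearson correlation: covariance of two columns divided by the
# product of their standard deviations (here via sums of squares).

from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(round(pearson([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0 (perfectly correlated)
print(round(pearson([1, 2, 3, 4], [8, 6, 4, 2]), 3))  # -1.0 (perfectly inverse)
```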

Missing Values:


This section shows where values are missing. A separate Sample section displays the first 10 rows and the last 10 rows of the dataset.

Profiling Report for Pandas: Advanced Options

Numerous options and sophisticated parameters are available in the pandas profiling report. The visual output of a report and the specifics of each chart and visualization are all controlled by settings.

Settings you should know

title: The title attribute sets the title of the report. This optional attribute is set when the profile report is created. An example is shown in the third line of code above.

to_file(): Produces the profile report as an HTML file that may be stored outside the Jupyter notebook.

e.g., profile.to_file("flights_data.html")

The settings can be used in two different ways. When creating a profile report, the first option applies modifications as extra characteristics using a dictionary. The second choice again defines key-value pairs using a dictionary and navigates to the necessary parameters using dot notation.

Since Version 2.4, a minimal mode has been available for large datasets. This setting turns off costly computations by default (such as correlations and duplicate detection).

from pandas_profiling import ProfileReport
profile = ProfileReport(df, minimal=True)
profile.to_file(output_file="output.html")

Conclusion

We hope the Pandas profiling library will help you analyze your data faster and more easily. What do you think of this wonderful library? Try it out and let us know about your experience in the comments section.

It is able to process large data sets with minimal visual output, and the library works great even on large data.

Most other IDEs, including PyCharm and Jupyter Notebook, are compatible with Pandas Profiling.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.


Huobi Global: Providing Profit Opportunities During A Crypto Winter

Since the crash of algorithmic stablecoin LUNA in early May, the crypto market has seen over US$45 billion lost in value, shattering the dreams of thousands of investors, some of whom invested their life savings into what they thought was a safe haven. 

Last week, crypto fund assets fell to their lowest point since July 2021, and Bitcoin (BTC) fell to a low of US$25,892 from a high of US$47,444 just two months ago. 

Terra founder Do Kwon has since made efforts to recompense investors by creating a fork of the Terra blockchain: a new chain that does not include the UST stablecoin project. Naming the original chain Terra Classic (token LUNC) and the newly forked chain Terra (token LUNA), Kwon has been trying to rebuild the Terra ecosystem. 

An initial round of airdrops of the new LUNA tokens has been made to LUNA’s original token holders, and a plan for long-term, yearly distribution will be carried out over time in an attempt to compensate users based on the losses they have suffered. Following efforts from Kwon and various market players to stabilize the crypto space, BTC has recovered and stabilized at around US$30,000, with prices of other layer-1 tokens, including ETH and SOL, reaching stable levels as well. 

During the market crash and the period of immense volatility, Huobi’s first response was to protect its investors and offer other opportunities for trading profits. Shortly after the crash, Huobi’s derivatives and spot trading platforms delisted LUNA to protect traders from further damage; as Terra 2.0 launched its recovery plan and airdrop, the new LUNA token was immediately re-listed to provide a measure of compensation to original investors. Since then, Huobi has also introduced new strategies and rewards, backed by its own liquidity, to help investors earn a profit even during a crypto winter. 

Exciting Airdrops and Rewards

Earlier this month, Huobi Global introduced new innovative products and strategies that offered rewards of US$31 million a month. The series of promotional events aimed to attract more investors to the crypto space at all levels of financial knowledge and is the opening of a series of campaigns by Huobi such as PrimeList and PrimeEarn. 

In March of this year, Huobi also launched the CandyDrop campaign, a promotional event that offered free token airdrops every day and a $4.34 million USDT prize pool in the past month. Results of the event were strongly positive, with a total of 1.25 million people participating, and over 320,000 participants successfully receiving airdrop rewards. One individual user won prizes worth 2,470 USDT based on the highest value on the day of listing.

Competitive Investing Strategies

While exchanges stand to gain through high trading fees and lower returns on deposit products, Huobi has worked to bring competitive rates that may help its users earn better returns on deposit and derivative products. The debut of PrimeEarn, a crypto-asset management and deposit product, offered users notably high APYs for fixed deposits of mainstream assets. Users gained a chance to earn up to 30% APY for staking mainstream crypto assets such as BTC, USDT, and ETH when they participated in Huobi PrimeEarn High-Yield Tuesday events. 

The 20%-30% APYs offered on USDT and ETH deposits with PrimeEarn are the highest rates offered on the market by crypto-asset exchanges today, a result of Huobi’s dedication to providing competitive rates to its users, and are much more lucrative than the average rates of around 7% for mainstream assets offered by exchanges such as Binance. As for PrimeEarn, the deposit pool accumulated throughout the past 5 events stands at a total of 660 million USDT, and rewards for users who participated in group competitions reached as high as 5 million USDT. 

Aside from deposit products, Huobi also hosts competitive rebates for derivatives and became the first exchange to offer maker fee rebates for USDT-margined futures to all users, at a rate of 0.015%. 

With more attractive returns and APYs, Huobi hopes to attract more users of all experience levels. Crypto-derivatives in particular can prove to be a beneficial way to safeguard investor pockets, especially during volatile market conditions, as they can diversify portfolios and offer sophisticated strategies. With trading products such as Huobi Options and Huobi futures, traders can speculate on future prices of an underlying asset for income at a much lower cost than the asset itself, or hedge the risk exposure of their existing positions.

An Expanding Global Market

Even during a market downturn, Huobi capitalizes on its mature corporate ecosystem to expand services in emerging markets and builds on its compliance capabilities by acquiring new licenses and regulated service entities. In recent times, Huobi announced the acquisition of Bitex, a regional cryptocurrency exchange with operations in Argentina, Chile, Paraguay, and Uruguay. This is the first in a series of expansionary investments and acquisitions planned for 2022, each of which is designed to accelerate Huobi’s global growth. 

During the crypto winter, Huobi stays bullish with expansionary strategies and aims to help more users across the globe find new opportunities for trading, ultimately supporting the long-term growth of the cryptocurrency industry. 

How To Avoid SEO Disaster During A Website Redesign

The word “disaster” probably got your attention and in this case, it’s not the least bit hyperbolic.

Website migrations that don’t account for SEO or are undertaken with hasty and/or weak plans for SEO can be disastrous.

If you’re doing a website refresh, migration, and/or relaunch, you likely have solid reasons for doing it and expectations for improvement in specific areas like UX, brand perception, and conversion goals.

SEO is essential for helping you achieve all of the above.

In this column, you’ll find important pre-launch and post-launch steps to ensure that SEO is prioritized in your relaunch so you can avoid disaster and enjoy the best possible outcomes for your business.

Work through this piece as a checklist as you plan and execute your website redesign.


Before Migration

Think about what it is you’re hoping to accomplish with this investment in reimagining your website.

Whatever else you hope to gain, you definitely want to protect the value and equity your site has built up over time. You don’t want to lose current rankings, link value, or traffic — and you don’t want to spend months to recover or rebuild them, either.

On top of that, you likely also want to improve organic search performance.

Goals & Project Plan

There are likely some pretty compelling reasons why you are embarking on a website redesign.

Those could be tied to:

Business use-cases.

User experience improvements.

Marketing initiatives.

SEO improvements.

Setting appropriate goals is a key aspect of ensuring the project’s success.

Establish your baseline and benchmarks, as you’ll want to be able to confirm improvements and ROI on the project.

Most web projects follow a specific plan or agile methodology. This plan is typically managed by the project owner whether they are in an account service, project management, product, IT, marketing, or some other aligned role.

Make sure your plan accounts for SEO at every step so you aren’t surprised by any unintended consequences.

Content & Information Architecture

Both the overall subject matter of the website and your sub-topic themes are critical to SEO success.

This affects the specific content on the site and how that content is organized.

Changes to the information architecture, sitemap, and overall content plan in the redesign can impact SEO, and ensuring that everyone on the project understands how is important.

You want to make sure pages valuable to your SEO strategy are not omitted from the site going forward, and that the overall message and theme of sections of the site are not diluted by the design.

Use a crawling tool (e.g., Screaming Frog or DeepCrawl) to find all pages of your website.

Then, with your team, work from your current sitemap on requirements for the new one. This will be your guide through the rest of the steps in the redesign process for SEO.

On-page Optimization

Digging deeper to the page level, it’s important that you maintain the relevance of content to the intent of your target searcher.

Once you know what context and overall architecture is changing (or remaining the same), you can work to protect or proactively optimize at the page level for the specific elements that help with relevance. This will include URLs, page titles, meta descriptions, body copy, alt text, etc.

How deep are the changes to your architecture and sitemap? This will dictate how much you need to focus on the relevancy of content to ensure you don’t lose subject matter content on the site.

Optimize your staging site or code-base; do not wait until post-launch to make these updates.


Redirects

There’s a basic user experience case to be made for ensuring that you map out 301 redirects for all pages that are getting new URLs and for those that are going away in the redesigned site.

We never want to serve up a 404 error page to a user if we can avoid it.

Search engines are okay with 404s if we want content removed from the index. However, any link equity you’ve built will be lost if backlinks to your site result in a 404 error.

Ensure that all pages with links pointing to them are (at the very least) properly redirected, especially if you don’t have control over ensuring that the links are updated to the new destination page URL.

If you have a large website, this could very well be the most time intensive and important part of the redesign SEO process.

Use the crawl that you did earlier for your sitemap planning to determine all URLs that need to be redirected.

You can also gain a lot of insight from Google Search Console and Bing Webmaster Tools as to which pages the search engines crawl on an ongoing basis, to make sure you don’t miss any redirects.

When you have all redirects mapped out, ensure that they are implemented at the server level or in a site plug-in or utility and are ready to go for launch.
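A redirect map is, at heart, just old-URL-to-new-URL data, and it pays to sanity-check it before launch. The sketch below is illustrative, with hypothetical URLs and function names; one useful check is catching redirect chains, where a mapped destination is itself a redirected URL.

```python
# A redirect map as plain data, plus a pre-launch check for chains:
# a destination that is itself a redirect source forces users and
# crawlers through two hops instead of one clean 301.

redirects = {
    "/old-services": "/services",
    "/team.html": "/about/team",
    "/blog/post-1": "/insights/post-1",
}

def find_chains(redirect_map):
    """Return destinations that are themselves redirected (chains)."""
    return sorted(dst for dst in redirect_map.values() if dst in redirect_map)

assert find_chains(redirects) == []  # clean map: no chained redirects

# Introduce a chain: /services now redirects somewhere else too.
redirects["/services"] = "/what-we-do"
print(find_chains(redirects))  # ['/services'] should be flattened to /what-we-do
```

Flattening chains so every old URL points directly at its final destination preserves the most link equity.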

It is difficult to do this work after launch, as the damage has often been done with search engines and users getting 404 errors. Every passing minute, hour, and day equals more lost opportunity here.

Even when your SEO professional can use forensics to find the old site URLs and implement redirects, you’ve lost precious time. You’ll have to take the short-term hit and hope to get back the relevancy and authority status you had with the old website.

At Launch

At launch time, follow along with the go-live checklist and perform any possible quality control checks of the work you have done on the staging site to date.

Don’t give the go-ahead for launch if any of your on-page work or redirects are not in place or tested.

It is much easier to slightly delay launch than to undo damage later or – exponentially worse – to have to roll back to the old website (ouch).

Post-Launch

Check Redirects

Your first step is to go back to your redirect file, old sitemap, and old site crawl to test and ensure that all old site URLs perform 301 redirects to new site URLs, as you intended.

This is where you can catch any stray 404s and implement additional redirects quickly.

Begin by spot-checking URLs and then go deeper as time permits to work through as many old site URLs as possible.

Also, check for other sneaky redirects like 302s or server methods to ensure you have clean 301s.
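The spot-check above boils down to bucketing crawl results by status code. This is a minimal sketch with invented URLs and an assumed crawl-result shape of (URL, status) pairs; a real audit would fetch the status codes with a crawler.

```python
# Bucket post-launch crawl results: anything that is not a clean 301
# needs attention (302s are "sneaky" temporary redirects; 404s mean a
# missing redirect).

def audit_redirects(crawl_results):
    """crawl_results: list of (old_url, status_code) pairs."""
    issues = {"sneaky_302": [], "missing_404": []}
    for url, status in crawl_results:
        if status == 302:
            issues["sneaky_302"].append(url)
        elif status == 404:
            issues["missing_404"].append(url)
    return issues

crawl = [("/old-services", 301), ("/team.html", 302), ("/blog/post-1", 404)]
print(audit_redirects(crawl))
```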

Dev-to-Live Audit

Make sure that all pages and specific on-page optimization carried over from the dev site to the live site.

This is particularly important for websites with a lot of dynamic content, as sometimes databases and tables get missed in the migration.

For example, if you optimized all title tags on the staging site but the database they are in didn’t go live at launch, you might find missing or default duplicate titles on every page or on product pages, etc.

Code & Performance Validation

Don’t assume that the live website will perform the same as the staging site did.

Run the homepage and key pages through the mobile-friendly testing tool or Lighthouse audits to ensure the site achieves passing grades.

Also, don’t forget about any schema markup you have on the site.

Using validation tools to ensure proper implementation is helpful here, in case anything changed between the old, staging, and live sites and how the search engines render them.

Submit XML Sitemaps

Once you are satisfied with your redirects working properly and the implementation of SEO on the live site, it is time to submit the XML sitemap.

Ensure that the dynamic sitemap includes the desired full set of destination URLs.

If you are using a static sitemap, generate a new one now, audit it, and submit it.

Note that you want to be sure that your XML sitemap file(s) are pristine. You want to have zero URLs resulting in 404 errors and ensure all URLs are the destination URLs versus redirects or pages that canonical to another version.
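That hygiene rule can be checked mechanically: the sitemap should contain only final destination URLs, so nothing that 404s and nothing that is itself a redirect source. The sketch below uses hypothetical URLs and inputs you would gather from your crawl and redirect map.

```python
# Flag sitemap entries that are redirect sources or dead URLs; both
# must be removed so the XML sitemap lists only destination URLs.

def bad_sitemap_entries(sitemap_urls, redirect_sources, dead_urls):
    """Return sitemap URLs that are redirects or 404s."""
    return sorted(set(sitemap_urls) & (set(redirect_sources) | set(dead_urls)))

sitemap = ["/services", "/about/team", "/old-services", "/retired-page"]
print(bad_sitemap_entries(sitemap,
                          redirect_sources=["/old-services"],
                          dead_urls=["/retired-page"]))
# ['/old-services', '/retired-page'] should be dropped from the sitemap
```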


Monitoring

It feels good to be done with the hard work involved in SEO for the relaunch and the migration overall. Now, it’s important to shift your mindset to a monitoring phase.

For the next 1-2 months, closely monitor Google Search Console and Bing Webmaster Tools to watch for reported 404 errors, crawl errors, and any HTML on-page issues detected. Tackle these quickly.

Ongoing SEO

Remember that SEO is not a one-time thing.

Once the dust has settled and the monitoring phase is in motion, you can go back to your original plan and goals and measure the performance of the new site.

From here, you can resume your normal ongoing optimization plan.

Summarized Steps

Here’s the short list of what we unpacked in this article.


Before Migration:

Goals & Project Plan.

Content & Information Architecture.

On-Page Optimization.

Redirects.

At Launch.

Post-Launch:

Check Redirects.

Dev-to-Live Audit.

Code & Performance Validation.

Submit XML Sitemaps.

Monitoring.

Ongoing SEO.

Account for some steps taking longer than planned or having to add some as you go. Web projects and SEO don’t always go according to plan!

Beyond that, make sure you take every opportunity to use the redesign and relaunch for performance improvement, to maximize the return on your investment in your site.


Boon From Big Data Or Loss Of Privacy?

Today’s post is going to be different.

There is no technical subject matter I am going to talk about, but this article is far more thought-provoking than any of the articles I have written to date.

A real life incident:

Let me start with a real life example to get your thinking process started:

About 6 months back, I bought a top-end Android smartphone. After using it for a month or so, I accidentally started Google Now on the phone. The interface looked very simple at first glance (nothing more than a search bar and a weather update). So I moved back from the application and went on living my usual life.

I would have almost forgotten this instance, like multiple other applications that come with the phone that I don’t use. However, Google had something else in store. A week after I opened the application for the first time, I got a notification on my home screen suggesting that I was 15 minutes away from home and that the traffic en route was normal!

The notification took me by surprise. I never told my phone where my home is! Over the next few days, the application identified my office, my commute, my friends’ places, and the websites I visit frequently. It now integrates my searches across devices, so if I search for a restaurant on my laptop, my phone shows me the route to the same restaurant!

The incident above is a dream come true for a lot of analysts, and a scary loss of privacy for a lot of customers.

As an analyst and someone who specializes in predictive modeling, I am usually a proponent of big data and the changes it is bringing to our day-to-day life. However, I have to admit that Google took me by surprise and has made me think and reflect a lot more on how life is changing. It has set off a debate between the two sides of my personality.

[stextbox id = “section”]Two sides of debate:[/stextbox]

My first personality is that of a common man. I want my privacy, especially during personal moments. These moments could be the time I spend with my family, when I am reading, or maybe when I am talking to a friend. I don't want interruptions or suggestions from any third party during these periods; I want to relish the moment as it is. After going through the experience mentioned above, and many more like it, I am not sure whether these moments will remain as pristine and unadulterated as I would want them to be. Would my reading experience be marred by suggestions of other things I might like? Would my phone pop up notifications about a friend while I am talking to them, or maybe while I am talking about them to my wife? The possibilities are limitless!

The other side of my personality is a big proponent of technology and analytics. I remain excited about how technology can be used to solve day-to-day problems. I come up with innovative ways of using data to create value (for customers as well as organizations), and I continuously think about how behavioural modeling can help customers breeze through day-to-day chores, and how I can predict something before it actually happens.

[stextbox id = “section”]How do we resolve this?[/stextbox]

The second personality needs to be cognizant of the presence of the first and take actions that are in sync with its values. Here are some rules I have come up with, which every analyst needs to keep in mind while designing a product or working on their next big data project:

[stextbox id = “section”]1. Transparency:[/stextbox]

This is the biggest takeaway. The bare minimum an analyst needs to ensure is that the customer is aware of what data is being collected and how it can be used. This needs to be communicated clearly, similar to apps (on smartphones) asking for permissions before installation. If you are collecting data without asking the customer explicitly, you are headed for disaster.

So, instead of relying on a pre-selected tick box (buried somewhere in my phone's settings), I would have appreciated it if the app had reminded me of the data it would use when I started it for the first time.

[stextbox id = “section”]2. Develop a character of your Organization by keeping customer at the heart:[/stextbox]

Let me try to explain. Years before Google started collecting usage information from Android phones, Microsoft did the same for MS Office. They asked me whether I wanted to share my usage patterns with Microsoft to help them improve the user experience further, and I almost always declined. When Google asks me the same thing, I am more open to sharing the information.

It might be a personal choice. However, the reality is that I am more open to sharing data with Google because I can relate to the benefits they have provided me by using this information; I have visibly benefited from sharing it.

The message is simple: if you don't pass the benefit of this information back to the consumer, they will stop sharing it.

[stextbox id = “section”]3. Make change in subtle manner:[/stextbox]

Big changes in the user interface, or in the way a new product gets rolled out, can take customers by surprise. You have to build these changes in subtly, in a way that keeps the customer feeling as much at home as possible. I think Google does a nice job at this. Here are some best practices:

Provide an option for the user to switch back to the old proposition if it is not working for them.

Try to keep as much of the user interface unchanged as possible.

[stextbox id = “section”]4. Test and roll-out:[/stextbox]

Irrespective of how good an idea is, you should avoid a complete roll-out without testing. There are multiple benefits to this:

You act based on how customers actually feel about the product.

You can size the benefit / loss of moving to the new product.
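To make the second point concrete, here is a minimal sketch of how one might size the benefit or loss of a new product in a test roll-out. It assumes a simple A/B setup with conversion counts for a control group (old experience) and a variant group (new experience); the function name, the numbers, and the use of a pooled two-proportion z-test are all illustrative choices, not a prescribed methodology.

```python
import math

def ab_test_summary(control_conv, control_n, variant_conv, variant_n):
    """Compare conversion counts for the old (control) and new (variant) experience."""
    p_c = control_conv / control_n
    p_v = variant_conv / variant_n
    lift = (p_v - p_c) / p_c  # relative benefit (or loss) of the new product
    # Pooled two-proportion z-test as a quick significance check
    p_pool = (control_conv + variant_conv) / (control_n + variant_n)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / variant_n))
    z = (p_v - p_c) / se
    return {"control_rate": p_c, "variant_rate": p_v, "lift": lift, "z": z}

# Hypothetical roll-out to a small slice of users, compared against control
result = ab_test_summary(control_conv=400, control_n=10_000,
                         variant_conv=460, variant_n=10_000)
print(f"lift: {result['lift']:.1%}, z-statistic: {result['z']:.2f}")
```

A z-statistic near or above 2 suggests the measured lift is unlikely to be noise, which is exactly the evidence you want before committing to a full roll-out.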

I think that until and unless organizations and analysts adhere to these rules, it may only be a question of time before they face a crowd of disgruntled customers.

If you like what you just read and want to continue your analytics learning, subscribe to our emails or like our Facebook page.
