Mastering the RAG Architecture: A Scientific Approach to Building Domain-Specific Chatbots

In today’s fast-paced Large Language Model (LLM) landscape, the Retrieval-Augmented Generation (RAG) architecture has emerged as a game-changer. RAG is a novel architecture that enables the use of LLMs like GPT-3.5/4 or LLaMA to build domain-centric chatbots without the need for expensive fine-tuning. It employs clever techniques to identify relevant contexts from the data, which can then be passed to the LLM to synthesize answers. While it has been instrumental in several notable production use cases, including our own Eryl product under the GeneraX umbrella, the journey of RAG’s mainstream adoption is only just beginning.

At Affine, we don’t just adopt technology; we sculpt it. We have taken a scientific approach to harnessing the capabilities of the RAG architecture for building production-grade LLM customer solutions. This includes our Eryl product, which embodies our philosophy of implementing scientifically engineered solutions tailored to individual customer requirements.

RAG’s efficacy pivots on various design parameters. But how does one ensure peak performance? For us, it’s about a rigorous, scientific approach borrowed from hyperparameter tuning for Machine Learning and Deep Learning models. We systematically navigate these parameters, evaluating their performance on real-world test data, such as customer interactions in chat sessions that have received high Net Promoter Score (NPS) ratings, an industry-standard metric for customer satisfaction.

When it comes to building scalable, production-grade, and hallucination-free LLM applications, the key objectives are not only output accuracy but also factors like latency and inference cost. We evaluate the performance of every hyperparameter combination against all these factors and select the iterations that rate highly across all success factors.

Listed below are some RAG hyperparameters we utilize while developing LLM applications:

  1. Chunk Management Related: At the heart of RAG’s contextual retrieval lies a matrix of parameters – chunk size, overlap window, and top K chunks for retrieval (meaning the top K most relevant text chunks that are retrieved). Much like deep learning tuning, we employ an iterative but optimized methodology to discern the most effective combination.
  2. Embedding Model Fine-tuning: Fine-tuning the embedding model ensures the domain specificity of embeddings, thereby allowing retrieval of relevant chunks from the vector databases.
  3. Generator LLM Fine-tuning: By refining the synthesizer LLM on specific customer documents, it becomes attuned to unique nomenclatures and keywords. Given that this LLM steers the response synthesis, generating the final text that the end-users interact with, alignment with customer-specific lexicons is pivotal.
  4. Enhancement with Knowledge Graphs: Incorporating Knowledge Graphs with RAG becomes a force multiplier, especially for intricate, multi-contextual, or multi-hop queries, where the model needs to consider multiple factors or steps to generate an accurate response.
  5. Hard cutoff on Cosine Similarity: The conventional method of selecting Top K embeddings may still result in hallucinations, as for certain queries, none of the top K chunks may be relevant. In such cases, it is essential to have a hard cutoff on cosine similarity that only fetches chunks above the threshold.
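
The first and last of these hyperparameters can be illustrated with a short sketch. This is not our production code: embeddings are plain lists of floats, and in practice an embedding model and a vector database would supply them. The sketch shows overlapping chunking and top-K retrieval gated by a hard cosine-similarity cutoff.

```python
import math


def chunk_text(text: str, chunk_size: int, overlap: int) -> list[str]:
    """Split text into fixed-size chunks with an overlap window."""
    if not 0 <= overlap < chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - overlap, 1), step)]


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Plain cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def retrieve_top_k(query_emb, chunk_embs, k: int, cutoff: float) -> list[int]:
    """Return indices of the top-K chunks, keeping only those above the cutoff.

    The cutoff is the 'hard cutoff on cosine similarity': if none of the
    top-K chunks clear it, nothing is passed to the generator LLM,
    which avoids synthesizing an answer from irrelevant context.
    """
    ranked = sorted(range(len(chunk_embs)),
                    key=lambda i: cosine_similarity(query_emb, chunk_embs[i]),
                    reverse=True)
    return [i for i in ranked[:k]
            if cosine_similarity(query_emb, chunk_embs[i]) >= cutoff]
```

Raising the cutoff trades recall for precision: fewer chunks reach the generator, but those that do are more likely to be relevant, which is exactly the hallucination-control behavior described above.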

Our approach involves systematically iterating through combinations of the above design parameters in an optimized fashion and evaluating performance on test data. Note that iterations involving fine-tuning the embedding or generator LLM models can be computationally expensive and should be undertaken only if the development budget allows.
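
The sweep itself can be sketched in a few lines. The `evaluate` function and the parameter values below are hypothetical stand-ins; in practice it would run the full RAG pipeline against held-out chat sessions and return a combined score over accuracy, latency, and inference cost.

```python
from itertools import product


def sweep(param_grid: dict, evaluate) -> tuple[dict, float]:
    """Try every combination of RAG design parameters; return the best one.

    evaluate(config) -> float is assumed to score a full pipeline run
    on test data (higher is better).
    """
    best_config, best_score = None, float("-inf")
    for values in product(*param_grid.values()):
        config = dict(zip(param_grid.keys(), values))
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score


# Example grid; real values are chosen per customer corpus.
grid = {"chunk_size": [256, 512, 1024],
        "overlap": [0, 64],
        "top_k": [3, 5]}
```

An exhaustive grid is shown for clarity; when fine-tuning steps make each evaluation expensive, a budgeted search (random or Bayesian) over the same grid is the more economical choice.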

The points below capture the key performance metrics and other ML/LLM hygiene practices we adopt when building LLM applications:

  1. Performance Metrics: Our benchmarking isn’t just about accuracy. By analyzing real human chat logs with high NPS scores, we gauge efficacy. Additionally, parameters like latency and cost of inferences help construct a system that’s precise, economical, and prompt.
  2. Optimization within Boundaries: Despite the computational complexity, especially when fine-tuning the embedding and generator models, we ensure that development remains within budget constraints, thus achieving a balance between performance and cost.
  3. Systematic Record-Keeping with MLOps: Tools like MLflow are invaluable, enabling us to meticulously document all iterations, providing a robust framework for tracking changes, and ensuring that the model can be easily deployed or rolled back as needed.
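
The record-keeping pattern behind this practice can be shown with a toy tracker. It is a hypothetical stand-in mimicking the log-params/log-metrics style of tools like MLflow, not MLflow's actual API: each iteration logs its configuration and scores so runs can be compared, reproduced, or rolled back.

```python
import json


class RunTracker:
    """Minimal experiment log: one record per hyperparameter iteration."""

    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> None:
        """Record one iteration's configuration and its evaluation scores."""
        self.runs.append({"params": params, "metrics": metrics})

    def best_run(self, metric: str) -> dict:
        """Pick the run with the highest value of the given metric."""
        return max(self.runs, key=lambda r: r["metrics"][metric])

    def export(self) -> str:
        """Serialize the full run history for audit or rollback."""
        return json.dumps(self.runs, indent=2)
```

In a real deployment the same three operations map onto an MLOps tool's tracking server, which adds artifact storage, model registry, and deployment hooks on top of this basic ledger.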

The culmination of these steps results in an LLM solution that’s not only primed for production but also accurate, cost-effective, and systematically built, ensuring reproducibility and reusability.

In summary, the RAG architecture isn’t merely an innovation in building QnA systems; it’s a game-changer in the realm of large language models. By enabling specialized chatbots to leverage the power of LLMs without the need for expensive fine-tuning, our Eryl product exemplifies how the intelligent use of LLMs, enabled by RAG, can yield a product that is not only cutting-edge but also finely tuned to meet distinct customer needs.

At Affine, we don’t merely adapt to technology; we shape it, refine it, and make it our own. We continually integrate groundbreaking technology into our ethos of delivering scientifically engineered solutions, creating products that are not just innovative but also tailor-made to tackle real-world business challenges head-on.

As we continue to advance in this journey, the RAG architecture stands as a cornerstone, showcasing the incredible potential and adaptability rooted in the synergy between retrieval and generation techniques in LLMs. We aim to go beyond just building chatbots; our vision is to build intelligent systems that can understand, learn, and adapt, setting new standards for what is achievable in the realm of artificial intelligence.

Announcing GeneraX – Affine’s Generative AI Product Suite

Affine has a rich legacy of developing AI-powered solutions. Right from its inception, there has been a strong emphasis on not just developing superior quality solutions but enhancing our learning curves and innovation opportunities. This approach helped us open up new avenues to solve business problems effectively. Thus, it has been the single most important differentiator, allowing us to build production-grade AI solutions for several global businesses.

Our accolades from global AI hackathons across multiple industries are a testament to the depth of our AI knowledge and our advanced practices. Notably, in hackathons such as the Data-Centric AI, HackerEarth, and Kaggle competitions, we were the only AI company to secure a spot in the top percentile alongside dedicated academic researchers in the field.

In the World of NLP:

Affine’s mastery in leveraging Transformer technology is well displayed in our NLP solutions. We combined our Deep Learning expertise with open-source models like BERT and RoBERTa to deliver ground-breaking solutions that helped organizations cut significant manual effort and deliver more accurate results. Some of our most recent solutions include a Document Summarizer, Context-Based Enhanced Search, and a Contextual AI Chatbot. Contact us to learn how these solutions can help your business.

In the World of Vision:

Our specialization in Stable Diffusion matured during the development of our satellite image segmentation product, Telescope. We used Stable Diffusion to create synthetic data for training the image segmentation model. Telescope was developed with the intent of saving the millions of dollars and months of effort that would otherwise go into land surveys across multiple industries. We also created a mechanism using GAN models to generate new gaming characters.

The Upcoming Generative AI Product Suite – GeneraX

The last few months have witnessed the widespread adoption of Generative AI, such as OpenAI’s GPT for text generation, DALL-E 2 for image generation, and Google’s Bard chatbot. Despite some limitations, these AI implementations are revolutionary and provide excellent results. However, they are not completely business-ready. Significant effort is required to ensure that these implementations deliver professional-grade, meaningful, and usable outcomes for businesses.

The grueling hours spent learning the in-depth workings of different AI technologies have always been guided by our intent to build the best real-world solutions that businesses can use and benefit from. Affine’s knowledge of how things work under the hood is coming together with GPT-3 and DALL-E 2 to create enterprise-level SaaS products. The GPT and DALL-E APIs have helped us speed up development, broaden scope, and convert the boutique solutions we pride ourselves on into plug-and-play products.

We’re kicking off our Generative AI product suite – GeneraX – with CreAItive!


“Are you a marketer frustrated with the prolonged ideation involved in designing creatives? Do you spend hundreds of thousands of dollars to create marketing-ready creatives, only to get a handful of variations? It’s time to get past this creative generation cycle. Introducing CreAItive, Affine’s Image Segmentation and Stable Diffusion powered solution. It’s a one-stop shop for design ideation, experimentation, and creation of 100+ market-ready images on the go, at a fraction of the time and cost.”

Are you ready to scale up your business with the power of AI? Watch this space for demo links and early adopter benefits on GeneraX!

For a product demo, contact us today!

What is Web3? What are its Use Cases?

In recent years, we have witnessed a massive shift towards digitization across various industries, from finance to healthcare, education, and entertainment. Digital Transformation has brought numerous benefits, such as convenience, efficiency, and accessibility. However, it has also created new challenges, such as centralization, data breaches, and privacy concerns.

Here comes Web3! It’s a new generation of the internet that promises to address these challenges by leveraging the power of decentralized networks. In this blog, we will explore the exciting world of Web3 and its potential to revolutionize how we interact with cyberspace. So, buckle up and get ready to uncover the future of decentralized digitization with Web3!

What is Web3?

The web we currently use to access and share information is the 2nd generation. In Web2, the content we produce is saved on central servers controlled by an authority. Data ranging from emails, health-tracker readings, shopping interests, social media posts, photos, and entertainment choices to web-browsing patterns is collected from users on a regular basis and stored with a centralized service provider, where users have no control over it.

This data has never truly been owned by the user, but rather by the central authority controlling the service. Web3, the 3rd generation of the web, aims to solve this critical ownership problem by shifting control of content from the central authority back to the users. Users have complete control over what they share and with whom, and can revoke permissions entirely at any time. Web3 is all about less trust and more truth.

How will Web3 be different from Web2?

The real necessity of Web3 – let’s look at real-life use cases that have shaped design thinking toward Web3:

Use Case 1: Many of us have played or heard of Farmville, the popular Flash-based game designed by Zynga on Facebook. In 2020, after 11 years of service, development ceased, leaving millions of fans unable to access the game assets they had purchased over the years. Web3 can solve this problem by transferring ownership of those assets, as limited-time collectibles, to the fans who bought them on an open, decentralized marketplace.

Use Case 2: When the popular social media site Orkut shut down, millions of users lost access to the photos and posts they had shared on the platform, actual memories from the early days of the web in the 2000s. Web3 can solve this problem by returning control of user data (posts, media) to the users, along with the freedom to take that data to a platform of their choice by making it interoperable.

Use Case 3: Free speech is a powerful principle of democracy that should be censorship-resistant. There are many cases of social media accounts being banned merely for criticizing an authority’s flaws, even when the criticism is truthful, indicating suppression of the free flow of open speech. Essentially, these accounts are permanently locked out of their previous posts on social media. An existing Web3-based decentralized social media platform like Mastodon solves this problem: users control the data they publish and can interoperate with other platforms of their choice, with a single, censorship-resistant source of truth.

What are the benefits of providing access to user data?

Healthcare data, for instance, can be shared with various medical sources for advancements in medical research, with the data exchange happening peer-to-peer. Our photos and media, meanwhile, can be permitted for upload to Facebook, Instagram, Flickr, etc., without uploading to each individually. The most important aspect of any Web3 application should be the incentive structure through which users benefit from companies accessing their data. Users who choose to grant access to their data should be incentivized for their contribution, something clearly lacking in the Web2 world.

Is Web3 based on blockchain?

One common misconception is that Web3 is entirely blockchain-based. In truth, Web3 is a culmination of technologies, of which blockchain is only one part. For instance, blockchains like Bitcoin and Ethereum provide trustless, permissionless cross-border payments between individuals, without any central banking authority controlling the transaction. Blockchains are excellent foundations for Web3, on which public platforms such as incentive structures, decentralized access, decentralized finance, NFTs, and DAOs can be built to support the principles of the Web3 ideology. Even standard technologies can be part of a Web3 application, provided they implement the basic principles of user privacy, ownership, and censorship-resistant data flow.

Web3 and Gaming Applications

As the trend toward Web3 adoption continues, we will see more games built around incentivizing users. Game designs will release limited game assets as collectible NFTs to fans, making them partners in the development process and creating a win-win scenario for companies and fans alike when the game performs well. Users can be assured that they will still own the game assets as collectibles even if the game shuts down in the future.

Web3 and DeFi (Decentralized Finance)

The true potential of finance will be unlocked when more financial products are implemented around the principles of Web3 and Decentralized Finance. Existing applications like Uniswap and AirSwap have taken the first steps in the evolution of Web3 financial products. Imagine finance becoming peer-to-peer between any two parties in the world, with transaction rules governed by a contract running autonomously on a trustless network. This removes a whole lot of unnecessary paperwork and intermediary fees and, most importantly, saves a lot of time by providing instantaneous access to various financial products, even in remote parts of the world where banking is a luxury. Decentralized cross-border payments are the future.

Web3 and Metaverse

The Metaverse is a digital platform that provides an immersive experience to users using AR and VR technologies. We can view it as a 3D web where users have 3D interactions with other users, bots, and applications. The Metaverse, as a platform, will enable enhanced social connections. Imagine Facebook as a 2D place where you can add a friend, chat with someone, join a group, etc. The same actions can take place in the Metaverse in 3D, with an enhanced user experience and richer social connections. Web3 will, in some ways, be a component of this digital social experience by powering apps that are censorship-resistant, decentralized, and secure.

Web3 and AI

AI is ultimately the umbrella under which the full potential of Web3 principles comes into play. By owning their data in its various forms, users gain complete control over who gets access and are incentivized for granting it. Imagine companies building AI models with access to reliable, high-quality data from real users who willingly participate in development. Users control what information to share and get incentivized; companies gain access to golden data to build AI models that perform better than ones trained on noisy data. Web3 principles will govern the flow and access of this data, creating a more inclusive environment.

Summing up!

Privacy by design and by default, less trust and more truth, and decentralized, censorship-resistant ownership are among the principles of any future Web3 application. Following these principles enables an ecosystem where humans, bots, devices, and applications can operate securely on a trustless network. While Web3 is still primarily a concept under development, some early applications demonstrate its implementation, such as Odysee, a decentralized video-sharing app; NFT marketplaces, where users are free to sell an NFT on a platform of their choice by simply connecting their wallet; and the Mastodon social network. In Web3, we can even imagine building decentralized machine learning models that perform more efficiently.

How will Artificial Intelligence Transform the Business Landscape in 2023?

Over the last two years, businesses of all sizes across the world have embraced AI in various forms and seen tangible outcomes. Artificial Intelligence is expected to make significant advancements, given the massive investment and continuous innovation of the last couple of years, with the potential to significantly improve our lives and the way organizations work across the digital transformation landscape.

AI has already revolutionized many industries, from healthcare to finance, and its applications are only going to grow. Accelerated AI automation has seen the most advancement in the recent past, especially in Generative design AI or AI-augmented design and Machine Learning code generation. We can expect AI-driven automation to power businesses to make better decisions, reduce costs, and increase efficiency.

AI-powered robots and autonomous cars are providing us with a new level of convenience. AI technology is drastically improving healthcare delivery and, as it becomes more integrated into our lives, has grown into an essential part of our day-to-day routines. The next phase of AI goes from narrow-scope systems to wide-scope ensembles. This will also be the period when AI governance and security are developed, scrutinized, and standardized. We are heading into an era where AI engines that currently drive decisions in silos for different business functions will be ensembled and synchronized for maximum efficiency and profitability at the enterprise level.

Generative AI will gain prominence!

Generative AI has become the buzzword of recent months, with applications like ChatGPT taking the world by storm (it crossed one million users within five days of launch). ChatGPT illustrates generative AI models in which computers generate text rather than simply copying it from other sources and rearranging it into new sentences, and such technology will only grow more ubiquitous over time. With generative AI, it is possible to create not just text but also images, videos, music, and even entire websites. The usefulness of this technology lies in its ability to automate content generation, provide personalized content, and generate a high volume of quality material. In 2023, we can expect generative AI apps to accomplish even more.

As with any new technology, we face challenges, some even greater than those faced by previous generations: scalability, privacy, and security issues will arise, and, of course, copyright questions. For AI to become the next creator, ethical concerns around how machine learning models are trained must be addressed. Industrial enterprises must set up frameworks that enable the democratization of information. The scope of Generative AI is large enough to warrant monitoring these challenges closely.

Advances in AI will lead to a rise in AI governance

As enterprises adopt more AI technology, data governance practices will improve, driven mainly by increased awareness among the public and regulating authorities. The burgeoning application of Artificial Intelligence has outpaced attempts to create a framework for regulating it. As public concern about AI’s impact on society grows, we can expect more countries to implement regulations such as the EU Artificial Intelligence Act and data-protection policies such as the GDPR in order to protect citizens.

As enterprises ramp up their use of AI, they will need to assess the potential risks involved and incorporate ethical standards into their strategies. Ethical use and governance of AI models and tools will be critical for all enterprises deploying them.

AI can help businesses detect and mitigate cybersecurity risks

AI will be instrumental in helping organizations implement proactive cybersecurity measures. By anticipating and preventing existing and emerging threats, AI will create a shield against any potential dangers.

As the number of cyber-attacks increases with each passing year, so does their complexity. Responding to these threats quickly and in real time is critical, the need of the hour. But how can you use all that data effectively? Machine learning models can learn from vast amounts of information quickly and respond to changing patterns; Artificial Intelligence will increase efficiency through automation and allow experts to better allocate resources toward more pressing problems.

The current decade will see full-fledged AI ecosystems unfold

According to Gartner’s hype cycle on emerging technologies, cloud sustainability and cloud data ecosystems will reach the “Plateau of Productivity” in the next five years, while various Accelerated AI Automation technologies (like Causal AI, foundation models, Generative design AI, ML code generation, etc.) will reach the “Slope of Enlightenment” and start moving into the “Plateau of Productivity”.

This means that in the next five years, organizations will consciously start replacing stand-alone AI engines making localized decisions with a wholesome digital ecosystem. This ecosystem is housed in the cloud, operated by automated AI systems, and interacts with business stakeholders and users via immersive technologies and blockchain-based transactions.

A retail customer would no longer be limited by traffic congestion, parking availability, or distance to the store when trying out merchandise in a virtual reality store. A surgeon’s exceptional skills could be deployed miles away in an area of need without waiting out the duration of a flight, saving countless lives. Construction and infrastructure designs could be tested for stability with great accuracy and speed, with agile adjustments along the way.

This sounds a little sci-fi, but the technology for the future is ready now. It just needs to be brought together.

In a nutshell

We are moving at an unstoppable pace toward an integrated AI ecosystem spanning all facets of our daily lives and every business. The way businesses interact and transact with consumers is going to be revolutionized by this ecosystem. Entirely new business models are being created around this technological evolution, impacting organizations of all industries and sizes.

While enterprises and consumers grow more familiar with different AI interactions, architects and engineers will take up the trend of “parenting” AI engines: what to do, how to do it, how well they learn, and how well they function.

Decision Intelligence: The Next Big Milestone in Impactful AI

As businesses take a global route to growth, two things happen. First, the complexity and unpredictability of business operations increase manifold. Second, organizations find themselves collecting more and more data – predicted to be up to 50% more by 2025. These trends have led businesses to look at Artificial Intelligence as a key contributor to business success.

Despite investing in AI, top managers sometimes struggle to achieve a key benefit – enabling them to make critical and far-sighted decisions that will help their businesses grow. In an era of uncertainty, traditional models cannot capture unpredictable factors. But, by applying machine learning algorithms to decision-making processes, Decision Intelligence helps create strong decision-making models that are applicable to a large variety of business processes and functions.

The limitation of traditional AI models in delivering accurate decision-making results is that they are designed to fit the data the business already has. This bottom-up process leads data scientists to concentrate on data-related problems rather than on business outcomes. Little wonder, then, that despite Fortune 500 companies spending an average of $75 million on AI initiatives, just 26% of those initiatives are actually put into regular use.

Decision Intelligence models work on a contrarian approach to traditional ones. They operate with business outcomes in mind – not the data available. Decision Intelligence combines ML, AI, and Natural Language queries to make outcomes more comprehensive and effective. By adopting an outcome-based approach, prescriptive and descriptive solutions can be built that derive the most value from AI. When the entire decision-making process is driven by these Decision Intelligence models, the commercial benefits are realized by every part of the organization.

Decision Intelligence Delivers Enterprise-Wide Benefits

Incorporating Decision Intelligence into your operations delivers benefits that are felt by every part of your business. These benefits include:

  1. Faster Decision-Making:
    Almost every decision has multiple stakeholders. By making all factors transparently available, every concerned party has access to the available data and predicted outcomes, making decision-making quicker and more accurate.
  2. Data-Driven Decisions Eliminate Biases:
    Every human processes data differently. When data is misread, these biases can impact decisions and lead to false assumptions. Using Decision Intelligence models, outcomes can be predicted based on all the data a business has, eliminating the chance of human error.
  3. Solving Multiple Problems:
    Problems, as they say, never come in one. Similarly, decisions taken by one part of your operations have a cascading effect on other departments or markets. Decision Intelligence uses complex algorithms that highlight how decisions affect outcomes, giving you optimum choices that solve problems in a holistic, enterprise-wide way, keeping growth and objectives in mind.

Decision Intelligence: One Technology, Many Use Cases

Decision Intelligence tools are effective across a multitude of business applications and industry sectors. Here are some examples of how various industries are using Decision Intelligence to power their growth strategies:

  1. Optimizing Sales:
    Decision Intelligence can get the most out of your sales teams. By identifying data on prospects, markets, and potential risks, Decision Intelligence can help them focus on priority customers, predict sales trends, and enable them to forecast sales to a high degree of accuracy.
  2. Improving customer satisfaction:
    Decision Intelligence-based recommendation engines use context to make customer purchases easier. By linking their purchases with historical data, these models can intuitively offer customers more choices and encourage them to purchase more per visit, thus increasing their lifetime value.
  3. Making pricing decisions agile:
    Transaction-heavy industries need agility in pricing. Automated Decision Intelligence tools can predictively recognize trends and adjust pricing based on data thresholds to ensure that your business sells the most at the best price, maximizing its profitability.
  4. Identifying talent:
    HR teams can benefit from Decision Intelligence at the hiring and evaluation stages by correlating skills, abilities, and experience with performance benchmarks. This, in turn, helps them make informed decisions with a high degree of transparency, maximizing employee satisfaction and productivity.
  5. Making retail management efficient:
    With multiple products, SKUs, and regional peculiarities, retail operations are complex. Decision Intelligence uses real-time information from stores to ensure that stocking and branding decisions can be made quickly and accurately.

Incorporating Decision Intelligence into the Solutions Architecture

CTOs and solutions architects need to keep four critical things in mind when incorporating Decision Intelligence into their existing infrastructure:

  1. Focus on objectives:
    Forget the data available for a bit. Instead, finalize a business objective and stick to it. Visualize short sprints with end-user satisfaction in mind and see if the solution delivers the objective. This approach helps technical teams change their way of thinking to an objective-driven one.
  2. Visualize future integration:
    By focusing on objectives, solution architects need to keep the solution open to the possibility of new data sets arising in the future. By keeping the solution simple and ready to integrate new data as it comes in, your Decision Intelligence platform becomes future-proof and ready to answer any new business opportunity or problem that may come along.
  3. Keep it agile:
    As a follow-up to the above point, the solution needs to have flexibility built in. As business needs change, the solution should be open enough to accommodate them. This needs flexible models with as few fixed rules as possible.
  4. Think global:
    Decision Intelligence doesn’t work in silos. Any effective Decision Intelligence model should factor in the ripple effect that a decision – macro or micro – has on your entire enterprise. By tracking dependencies, the solution should be able to learn and adapt to new circumstances arising anywhere where your business operates.

To Sum Up

Decision Intelligence is a powerful means for modern businesses to take their Artificial Intelligence journey to the next level. When used judiciously, it helps you make accurate, future-proof decisions and maximize customer and employee satisfaction, letting you achieve your business objectives with the least margin of error.

AI/ML for You, Me, and Everyone

Enterprises are adopting technology at an unprecedented speed as COVID has fast-tracked the digital transformation journey by a couple of years at least. Enterprises are focusing on innovative solutions to enhance customer satisfaction, optimal cost management, planning, etc., to stay ahead in the market; this is where digital transformation plays a critical role.

Where does AI stand in Digital Transformation, and how does it matter to businesses? 

Digital transformation integrates digital technology into different verticals of any enterprise, such as operations, delivery, and management. It is defined in four broader categories: process transformation, business model transformation, domain transformation, and organizational transformation. Process transformation mainly focuses on analytics and artificial intelligence-driven insights to automate processes and robotics, whereas business model, domain, and organizational transformations are centered around strategic decisions. Business model transformation redefines a company’s digital journey and how it adds value to its customers and overall business. Domain transformation fuels company growth by expanding the businesses into new domains, and organizational transformation is about adopting best industry practices within the organization.

The digital transformation market is expected to surpass the 1 trillion USD mark in 2025, up from 469.8 billion USD in 2020, at a compound annual growth rate of 16.5%.

Machine learning and artificial intelligence are niche technologies, and companies have started thinking about or utilizing these technologies aggressively as part of their process transformation journey. Market experts estimate that artificial intelligence-driven solutions will add approximately 13 trillion USD to the global GDP by 2030 and transform the world as electricity did almost 100 years ago. The research report below supports this prediction by depicting the three key digital transformation statistics that will play a crucial role in transforming an organization’s business.

These advancements have changed the demand curve for data scientists, machine learning, and artificial intelligence technologists. Artificial intelligence-driven digital solutions require cross-collaboration between engineers, architects, and data scientists, and this is where a new framework, “AI for you, me, and everyone,” has been introduced.

The “AI for you, me, and everyone” framework

Before designing any machine learning solution or application, architects must understand the complete landscape. Without it, they will run into challenges, such as productionizing ML pipelines, automating retraining, and serving real-time inferences, that they never experienced outside the machine learning environment. The same reasoning applies to product owners and engineers: they should be familiar with where AI/ML can and cannot be applied, along with its limitations. COVID has pushed the demand for data scientists to an all-time high, and the skill is in short supply.

One survey report found that over 50% of the workforce will be preparing for artificial intelligence or the technologies revolving around data science, and corporates have started investing heavily in upskilling talent internally. This is where the “AI for you, me, and everyone” framework becomes applicable, as it ensures that the majority of your workforce is upskilled in data science and its enabling workflows.

Daunting challenges for businesses across industries

  • Software companies are finding it tough to onboard data scientists, ML engineers, ML architects, or product owners who understand industry-wide machine learning applications
  • Upskilling is time-consuming, as employees have to work through a completely new technology stack
  • Theoretical knowledge is not sufficient; people cannot be productive without hands-on experience
  • A lack of bandwidth outside regular office work, and limited awareness of the benefits and industry trends, keeps people unskilled in or unaware of these technologies

How does the “AI for you, me, and everyone” framework help overcome these challenges?

Companies driving digital transformation should follow industry-wide best practices, and the “AI for you, me, and everyone” framework helps them upskill their internal talent pool. It will not only help companies ramp up their skills but also help them deliver projects involving trending AI/ML technologies on time, increase market share, mitigate unknown risks, and drive client innovation.

1. Learning paths: Companies must define a curriculum for employees based on their core skills, and enthusiasts must learn artificial intelligence and its enabling technologies in the context of those core skills, as this will help them get onto the ML track quickly. The representation below is a high-level visualization for data scientists and ML engineers, depicting how enthusiasts can transform their career path toward AI/ML or ML engineering. It covers ten broad areas of AI/ML and ML engineering, and professionals should have a fundamental understanding of these techniques and their applicability.

  • Data Scientists: Data scientists are primarily responsible for building AI/ML solutions and mathematical models and extracting insights from data. They should be well versed in Python, Jupyter notebooks, frameworks such as TensorFlow and PyTorch, the mathematical concepts used in algorithms, model building, and communicating results to stakeholders. It is always a good idea to familiarize yourself with at least one cloud provider’s AI/ML services, as it gives your skillset an edge.
  • Architect / ML Engineer: ML engineers and data engineers need to be well versed in OOP (Object-Oriented Programming) concepts in Python, Spark, data ingestion, storage, scalability, pipeline creation, and deployment. They also need good experience with various cloud services, along with their benefits and limitations. ML engineers usually handle multiple tasks, ranging from data acquisition from multiple sources to aggregation, processing, and storage of the data for further analysis. This workflow should be automated by setting up ETL pipelines.
  • Product owners: They should be aware of the latest happenings in the market, including the challenges companies are facing and how AI/ML can help overcome them. They should also be aware of AI/ML’s limitations, prerequisites, and areas of applicability across industries, as they will be driving customer requirements along with a complete review of client problems, competitor analysis, and the design of a comprehensive roadmap for the client.
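The ETL workflow mentioned above for ML engineers can be sketched minimally. Everything here is illustrative: the function names, the CSV-style in-memory source, and the dict standing in for a data warehouse are assumptions, not a prescribed design.

```python
# Minimal sketch of an extract-transform-load workflow an ML engineer might
# automate. All names and the in-memory "source" are illustrative; a production
# pipeline would use a scheduler (e.g., Airflow) and real storage.

def extract(raw_rows):
    """Acquire data from a source; here, parse CSV-style strings."""
    return [row.split(",") for row in raw_rows]

def transform(rows):
    """Clean and aggregate: drop malformed rows, cast numeric fields."""
    clean = []
    for row in rows:
        if len(row) == 2:
            name, value = row
            clean.append({"name": name.strip(), "value": float(value)})
    return clean

def load(records, store):
    """Persist processed records; a dict stands in for a warehouse."""
    for rec in records:
        store[rec["name"]] = rec["value"]
    return store

store = {}
raw = ["sensor_a, 1.5", "sensor_b, 2.0", "bad_row"]  # one malformed row
load(transform(extract(raw)), store)
print(store)  # {'sensor_a': 1.5, 'sensor_b': 2.0}
```

Automating these three stages as a pipeline is what turns a one-off analysis into a repeatable data workflow.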

2. Training: Companies should design a month-by-month training curriculum targeting both the business and technical sides of emerging technologies and the role of AI in the modern world. Such training programs not only help people move up the learning curve but also help the company in the long run by building a competitive edge along with credibility and trust.

3. Certification: People should be encouraged to pursue AI/ML certification programs, as they increase technical competency. Companies should cover the full or partial cost of such certifications and include them in quarterly or yearly goals. This approach sets the standard and motivates employees to upskill and complete the certification assessments.

4. Mentorship: Training programs are generally centered on imparting theoretical knowledge, but in practice people come across many challenges that no book talks about. Companies should assign a problem statement to employees undergoing technical training and a mentor to supervise them while they work toward a solution. Once a candidate has successfully implemented 2-3 solutions, they will be comfortable doing the research themselves and approaching a new problem independently with only initial guidance.

5. Involvement: Employees should be involved in a project where they get a chance to work closely with the team on a real client dataset and problem. Working on a real project allows employees to work with seniors on the team, improves the learning curve, and boosts employee confidence.

6. Competitions: Employees should be motivated to participate in Hackathons and competitions to improve their skillset. These opportunities and platforms help employees ideate and implement a prototype quickly and get a chance to identify other challenges and find solutions accordingly.

7. Academic Collaboration: The gap between academic institutes and industry is prevalent and needs to be bridged. Companies should take a step forward and initiate research programs with professors and Ph.D. students, going back to the institutes with potential industry problems to find the right solutions. This way, both professionals and students can learn from each other and solve new problems in their respective industries.

Exploring the AI/ML use cases:

Every industry is leveraging machine learning to optimize internal and external processes, helping them make data-driven business decisions. There are many use cases where artificial intelligence (AI) or machine learning is a crucial element. During their training, mentorship, or certification program, AI enthusiasts can pick any use case from the themes below:

  • Personalization in media, entertainment, and ecommerce
  • Forecasting in supply chain management
  • Cost / Resource optimization
  • Root cause analysis for machines
  • Chatbot for interactive query resolution
  • Defect detections in manufacturing units
  • Sentiment analysis for any product, policy, or content
  • Fraud / Anomaly detection
  • Object detection in an image or video
  • Image / Audio / Video Analysis
  • Language translation

Final Words!

AI/ML isn’t a silver bullet. While it can be a powerfully transformative technology that provides enormous value, getting started and learning how to implement AI/ML in your organization doesn’t have to be overwhelming or burdensome. If you’re intrigued by using AI/ML in your organization, this is where you start. Dive into small, manageable pieces to see what works for your business. Bet on technologies aligned to your business context and solve your critical challenges. Schedule a call today to learn more about our success stories and AI capabilities.

Can AI ease the messy chaos of Revenge Travel? 

Recently, Heathrow Airport saw incidents of mass flight cancellations, delays, and baggage issues, thanks to the resurgent zeal for traveling among people after the bottleneck caused by global travel restrictions. Such is the effect of the revenge travel phenomenon.

Tired of being locked down for over a year due to the pandemic, people started storming to nearby holiday destinations to break free from the humdrum activities and routine life.  

The travel industry took an unavoidable hit from the Covid shutdown. According to Statista, travel and tourism’s share of worldwide GDP saw a 50% freefall, from 10% to 5%, in 2020.

With any unnatural imbalance, an adverse effect is imminent, and in this case, a new trend emerged – Revenge Travel.  

New work trends have paved the way for Revenge Travel

The exhaustion of staying inside their homes for a continued period led to this reactive global phenomenon. Once cases started to decline and countries across the globe began easing travel restrictions, the vacation-starved populace, raring to make up for lost time and confinement, started the trend of revenge traveling.

While traveling was always an option for people, the revenge travel phenomenon was born of frustration at not having the choice to leave their homes.

As with contemporary trends, revenge travel gained an immense foothold, and people started booking airline tickets like there was no tomorrow. Staycation and workcation trends have emerged among organizations across the world, opening up possibilities to travel more than usual. People even preferred domestic traveling, and domestic flight bookings beat international flight bookings in July 2021.

So, what exactly is the solution? Like other industries, can technology play an aiding role in easing these issues? Can it help accelerate the performance of the travel industry?

Travel and Tourism – can AI be beneficial?

Messy travel experiences are an issue for customers, while businesses cannot afford to lose face. Everyone has been the recipient of a messy travel experience at least once in their lifetime: being allocated the wrong room, or having tickets booked for the wrong date or time. The classic story of a travel agent messing up one of the most important adventures of a person’s life is nothing new.

But travel aggregators have changed the landscape for travel and tourism businesses. AI has made travelers’ lives a lot easier by letting them book without visiting travel agents.

For businesses, AI offers to increase profitability in many ways. Pioneers in AI and data analytics have designed and developed solutions specific to the Travel & Tourism industry, benefiting both businesses and customers. Let us explore some AI-based Travel & Tourism solutions that can drive growth for the industry.  

Managing heavy demands & cancellations 

One of the major effects of the rise in revenge travel is the volatile demand. Flights, hotels, and tourist destinations were overwhelmed at once and the unpredictable nature of this demand brought instability and took the travel and tourism industry by surprise. 

The availability of big data offers valuable potential to tackle this challenge for many players in the industry. Leveraging data to forecast demand based on factors like customer behavior, price trends, and upcoming events can be a game-changer and help ease the unforeseen demand and excessive cancellations that plague the industry.

Demand & Cancellation Prediction & Management is an analytical OTA solution from Affine that does this along with predicting inclement weather and the resulting flight delays. By doing this, the solution also helps OTAs equip themselves to handle and assist customers, resolve queries, and manage rebooking in case of cancellations. 

This data-powered analytical solution helps OTAs predict demand, reduce cancellations, and manage refunds, while improving cash flow for the business. Effectively managing cancellations and refunds also results in a smooth customer experience and increased brand loyalty.
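
The forecasting idea described above can be sketched minimally. The weekly booking numbers, the simple moving-average model, and the flat cancellation rate are all illustrative assumptions, not a description of Affine’s actual solution, which would use far richer features such as price trends, events, and weather.

```python
# Illustrative sketch of demand forecasting for an OTA. The booking numbers
# and the moving-average model are assumptions; a real solution would use
# richer features (price trends, events, weather) and a proper ML model.

def moving_average_forecast(history, window=3):
    """Forecast next-period demand as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def expected_cancellations(forecast, cancel_rate):
    """Expected cancellations given a historical cancellation rate."""
    return forecast * cancel_rate

weekly_bookings = [120, 135, 150, 160, 170, 180]  # hypothetical weekly bookings
forecast = moving_average_forecast(weekly_bookings)          # (160+170+180)/3 = 170.0
cancels = expected_cancellations(forecast, cancel_rate=0.1)  # 17.0
print(forecast, cancels)
```

Even this crude baseline shows the mechanics: predict expected demand, then size refund reserves and rebooking capacity against the expected cancellations.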

Automated query handling – the need of the hour for both OTAs and customers 

With the revenge travel chaos and ever rising flight and hotel bookings, customers have many qualms and queries. The sheer volume of queries paired with the skyrocketing number of customers makes this a herculean challenge for OTA players. 

While agents are necessary to solve certain queries and issues, manual efforts simply can’t hold up to this excessive number of requests and a sea of travelers. 

OTAs need to automate the initial levels of travel queries for a smoother process. Furthermore, chatbots are far superior to manual labor in terms of time management and efficiency in handling the sheer volume of customers.  

Affine’s Contextual AI – Chatbot & Analytics is an AI-based chatbot that handles and manages major customer queries. Live agents are necessary for certain issues, but this chatbot transfers the customer to a live agent only when absolutely necessary, easing the load on agents while efficiently handling most mundane queries thanks to its intelligent capabilities.

For OTAs, this solution helps reduce operational and customer service costs by requiring fewer agents, as the chatbot handles the majority of the traffic. It also helps in understanding customer interactions, improving customer experience and overall satisfaction.
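
The escalate-only-when-necessary pattern described above can be sketched as follows. The intents, canned answers, and keyword matching are hypothetical; a production chatbot would use NLU models rather than keyword rules.

```python
# Minimal sketch of the "answer common queries, escalate only when necessary"
# pattern. Intents and keyword matching are illustrative stand-ins for NLU.

CANNED_ANSWERS = {
    "refund": "Refunds are processed within 7 business days of cancellation.",
    "baggage": "Each passenger may check in one bag up to 23 kg.",
    "reschedule": "You can reschedule from the My Bookings page.",
}

def handle_query(query):
    """Return an automated answer, or escalate to a live agent."""
    text = query.lower()
    for intent, answer in CANNED_ANSWERS.items():
        if intent in text:
            return {"handled_by": "bot", "answer": answer}
    return {"handled_by": "live_agent", "answer": None}  # escalation path

print(handle_query("When will I get my refund?")["handled_by"])       # bot
print(handle_query("My visa application was rejected")["handled_by"])  # live_agent
```

The design point is the fallback branch: everything the bot cannot match confidently is routed to a human, so automation never blocks a customer with an unusual problem.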

These are just a few example solutions; there are tailor-made solutions to improve almost every aspect of the travel & tourism industry, such as

  • Conversion rate 
  • Acquisition cost 
  • Ad impressions and many more. 

As people grow more dependent on technology day by day, providing a smooth customer journey is essential in the long run for players in the travel industry. Leveraging the abundance of data and the excellence of AI and ML technology makes for an airtight business practice headed towards sustainability & success.


The post-pandemic era has brought drastic changes to the lifestyles of people all over the world. The innate yearning for travel has burst forth, and traveling has become the de-stressing factor for the majority. Hybrid working models and work-from-anywhere trends have opened up the possibility of traveling with just a laptop and an internet connection.

Revenge travel may be a one-time phenomenon, but it has awakened the deep desire to travel within the populace across the world.  

Revenge travel is just a stepping stone for what is in store for the travel and tourism industry. The industry needs solutions that will help it operate efficiently and rake in higher margins. Booking agents are history and travel aggregators are competing across the industry, but AI-specific travel solutions will equip travel and tourism businesses with the future-ready, foolproof tools required to sustain.

What does Affine bring to the table?   

Affine is a pioneer and a veteran in the data analytics industry and has worked with space-defining brands like Expedia, HCOM, and Vrbo, to name a few. From travel & tourism to game analytics and media and entertainment, Affine has been instrumental in the success stories of many Fortune 500 global organizations, and is an expert in personalization science with its prowess in AI & ML.

Learn more about how Affine can revamp your Travel and Tourism business!  

AI to fuel the Film industry’s future

The worldwide revenue for theatres fell from an all-time high of $41.7 billion in 2019 to a jaw-dropping $11.9 billion in 2020. The film industry took a deadly hit from the pandemic, and the following lockdown brought the industry to its knees and raised questions about its future.

Source: Statista

Ever since the onslaught of OTT platforms, the media and entertainment industry has been shaken up, and a new form of revolution has set its foundation. The film industry is one domain that has borne the adverse effects of this revolutionary transformation over the past decade.

While the big screen and an unparalleled cinematic viewing experience remain unchallenged to an extent, access to home entertainment and content on demand has made a dent in the box office.

The Pandemic Saga

One of the biggest jolts for the film industry to date has been the pandemic, which brought things to a screeching halt and left the industry high and dry. Movie theatres had to shut down due to lockdown measures, and people confined to their homes took an interest in gaming and streaming shows on their couches as alternatives.

The result? Box office revenues plummeted to an all-time low!

The challenge lies in the future

The 2020 numbers look dreary, but as lifestyles return to normalcy again post-pandemic, the film industry still has a challenging task. Consumer behavior has changed. The average content consumer has seen value from OTT platforms that provide quality content on tap, and film as a product has deteriorated in value. Video on Demand offers immense value, and this is a critical film industry challenge that needs addressing.

If the five-year forecast from 2020 to 2025 is anything to go by, it is not going to be a smooth journey for the film industry. OTT platforms, with value entertainment on tap and aided by the unforeseen pandemic, have wreaked havoc and dethroned the film industry.

Source: Statista

But the charm of watching a movie on the big screen is unparalleled. The industry needs to revamp its practices in the process of film production. While a passion for the craft fuels the art of filmmaking, the technical and strategic processes stand to immensely benefit from AI practices explicitly designed for the film industry.  

Production and promotion – areas that need efficiency the most

A film’s success or failure has always been a gamble, but the production effort and cost are constant across most film titles. Solutions implemented right from the pre-production phase can result in substantial, measurable impacts.

Many studios spend an insane amount of funds on marketing and promoting their movies. With the current advertising landscape seeing a transformation, thanks to the latest content consumption habits, promotional budgets need to be scrutinized irrespective of the production scale.

Source: Statista

Save for the slump brought by the pandemic, the promotional budget for movies has seen an upward surge in the previous years and is back on track for 2021, which means higher spending and a bigger overall budget. While this amplifies the reach of the film across the globe, there are two main challenges here:

  1. Many small and medium-sized studios cannot splurge on sky-high budgets to promote their movies.
  2. Even big production houses sometimes go overboard with the promotions, and the movies earn less than expected.

Efficient promotions are the only way to go forward irrespective of the might of the production houses.

Commercial Forecasting System

Hollywood is no stranger to big-budget titles bombing at the box office while total underdogs clinch big victories. Sometimes a movie bombs locally but performs exceptionally well at international box offices, such as in China.

This AI (Artificial Intelligence)-based project management system from Affine helps production companies make smart, efficient, insight-driven decisions across the film’s production processes.

With this AI solution, production companies can predict the performance of their movies in local and international markets, across various demographics, at the respective production stages of the film.

Production companies stand to gain the benefits mentioned below by leveraging the Commercial Forecasting System:

  • Ascertain key foresight into film performances well in advance
  • Make necessary changes in the preliminary stages of production
  • Project realistic output numbers
  • Carry out efficient and data-driven marketing/promotional activities in tune with the film’s predicted performance across demographics and media types

Script Analysis

Time and time again, it has been proven that a good script is the foundation of a successful movie. With the diversity of content today, it is challenging to design a script that will assure superior performance at the box office.

Script Analysis is an AI and ML (Machine Learning) solution that learns from the plethora of data fed into it and analyzes the storyline to determine its success in the respective release regions, even at the pre-production phase. Historical film data helps the solution analyze similar scripts’ performances and predict the outcome with near-perfect accuracy, down to the micro level of demographics and age groups.

With the Script Analysis solution, production companies can leverage the benefits mentioned below:

  • Predict the near-accurate outcome of a script if it’s shot into a movie
  • Ascertain valuable insights that help make data-driven business decisions well before the production stage
  • Green-light scripts that are assured of performing well while making necessary changes to scripts that are not as optimal for business

Talent and Casting Analytics

Many great movies have had surprise castings that worked for them and changed fortunes for both the filmmakers and the talent. But there have been cases of miscasts ruining good movies as well. Casting can no longer be left to gut feeling; it must be treated like any other business process.

Many production businesses have already adopted AI-based casting methods to choose the right talent optimally. Affine’s Talent and Casting Analytics leverages data to generate insights on the impact of key talent on a movie’s box office performance.

Production companies can indeed gain advantages from the Talent and Casting Analytics solution in the following ways:

  • Provides casting suggestions based on historical roles in the actor’s portfolio
  • Use the cast as a variable to determine the film’s performance at the box office
  • Rank and simulate talent options based on their economic impact across dimensions like media type, genre, and key territories
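
As a rough illustration of ranking talent by economic impact, one could average each actor’s historical box-office revenues. The data and the simple averaging are assumptions, far cruder than a real simulation across media types, genres, and territories.

```python
# Illustrative sketch of ranking talent by historical box-office impact.
# The film data is made up; a real solution would model many more factors
# (genre, territory, media type) rather than a simple average.

def rank_talent(filmography):
    """Rank actors by the average revenue of their past films (descending)."""
    scores = {
        actor: sum(revenues) / len(revenues)
        for actor, revenues in filmography.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

filmography = {  # hypothetical revenues in $M
    "Actor A": [300, 250, 500],
    "Actor B": [120, 180],
    "Actor C": [400, 420],
}
print(rank_talent(filmography))  # Actor C first (avg 410.0)
```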

AI-powered box office predictor system

The sheer number of filmmakers has grown over the years, and many are challenging each other at the box office. This may be a treat for viewers, but as businesses, production houses can end up with losses.

At the end of the day, the commercial success of a film is just as crucial, if not more than its critical acclaim. If all the above solutions are the factors of the success equation of a movie, then an AI-powered box office predictor system is the main act.

With this solution, production houses, independent filmmakers, and distributors can predict the movie’s box office performance up to 6 months in advance. The plethora of business opportunities this solution provides is immensely insightful and can help film businesses make valuable decisions.

With Affine’s solution, you can leverage the following:

  • Predict film revenue at the box office well in advance with the highest accuracy rate
  • Help decision makers take steps to improve ROI (Return on Investment)
  • Forecast the promotional/marketing effort required per box office performance across regions, genres, and many other factors

The film industry will sustain, with AI behind the scenes

Films are not going anywhere, irrespective of the competitors. But the post-pandemic era comes with many changes due to multiple factors, ranging from content consumption behavior to global inflation.

People worldwide are in a price-sensitive phase, which brings the need for film production companies to rethink their game plan. With film industry-specific AI practices, they stand to benefit from box office success and efficient production, casting, and marketing processes, contributing to overall ROI.

What does Affine bring to the table?

Affine is a pioneer and a veteran in the data analytics industry and has worked with giants like Warner Bros Theatricals, Zee 5, Disney Studios, Sony, Epic, and many other marquee organizations. From game analytics to media and entertainment, Affine has been instrumental in the success stories of many Fortune 500 global organizations, and is an expert in personalization science with its prowess in AI & ML.

Learn more about how Affine can revamp your film production business!

See the World Through Your Lens: Introducing Next-Gen AI Satellite Image Segmentation Solution “TELESCOPE”

Assume you’re in the real estate business or own agricultural land and want to discover every detail about a plot of land or a location sitting a thousand miles away. Doesn’t that sound a little too intricate? Well, Telescope helps you do it in a snap!

What is Telescope?

It’s a next-generation AI satellite image segmentation solution capable of addressing complex business and operational requirements. Telescope uses a machine learning framework to classify information in a digital image (e.g., buildings, roads, grasslands). It then generates output segmentation data, which can be utilized for diverse business purposes such as pattern identification and object tracking.

Telescope emphasizes the concept of leveraging AI and has established a software package that utilizes cloud services to allow you to extract valuable data pertaining to your business. This platform lets users perform real-time image analysis on high-resolution satellite images and view adjacent locations with the accurate coverage percentage of greenery, land, buildings, and water bodies.

  • Telescope uses a cutting-edge combination of Computer Vision and GIS technologies to automatically retrieve high-resolution satellite images of sites covering up to 100 square km
  • It accepts single or multiple pairs of lat-long parameters, or location names in various formats
  • It uses a Deep Learning Feature Pyramid Network (FPN) model with a PointRend module to precisely predict label maps
  • It allows you to assess variations in water bodies or land shapes like dams, rivers, deserts, and mountains

Telescope supports a broad range of real-world applications, such as monitoring deforestation, urbanization, and traffic, recognition of natural resources, urban planning, etc. This novel technology can also be used in critical missions for the uniformed forces and in monitoring catastrophic events like volcanic eruptions, wildfires, and floods, among other things. With an intuitive interface and a substantial reduction in manual effort, it automates land surveys and minimizes the data-collection requirements associated with mapping and construction operations.

In other words, the solution enables faster and more accurate data processing, reducing time and cost, and is very handy for professionals. Furthermore, its automated approach yields quicker and more accurate outcomes, with much less reliance on third-party data sources.
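
Once a label map is predicted, the category-wise coverage percentages Telescope reports can be derived by counting pixels per class. A minimal sketch, with a made-up 4x4 label grid standing in for the model’s output:

```python
# Sketch of computing category-wise coverage percentages (greenery, land,
# buildings, water) from a segmentation label map. The 4x4 grid is a made-up
# stand-in for Telescope's FPN model output.

from collections import Counter

LABELS = {0: "land", 1: "greenery", 2: "building", 3: "water"}

def coverage_percentages(label_map):
    """Percentage of pixels per class in a 2D label map."""
    flat = [px for row in label_map for px in row]
    counts = Counter(flat)
    total = len(flat)
    return {LABELS[k]: 100.0 * v / total for k, v in counts.items()}

label_map = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [3, 3, 2, 2],
    [3, 3, 2, 2],
]
print(coverage_percentages(label_map))  # each category covers 25% here
```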

What is More Exciting About Telescope?

  • Its segmentation technique automates the process of extracting structures like land, lake, etc., from satellite images without the requirement of any advanced skills in Computer Vision or geospatial data
  • It has been built using advanced image processing algorithms, offering the most accurate results and seamless integration with real-time API services
  • The solution can be readily incorporated into current corporate GIS systems or used as a standalone solution to handle geographic data
  • Exports multiple high-resolution images leveraging Google Maps as a default source for satellite images
  • Highly interactive dashboard for reporting and visualization
  • Highly accurate percentage-wise representation of the areas covered in various categories
  • Saves up to 70% of your planning time and up to 30% of costs

A Bigger Picture Beyond the Flaws

Whether it’s a mission for uniformed forces, monitoring catastrophic events, feasibility analysis for a mobile tower setup, or planning a smart city, Telescope is an all-in-one solution for your business needs. Request a demo to learn how to leverage this solution for your business. Stay ahead of the technology curve!

Price Elasticity: How vulnerable is your product in the market?

Product Pricing

What elements do we consider when selecting a product’s price? Is it entirely dependent on customer demand? Firms and marketers are constantly striving to uncover the relationship between sales demand and price fluctuation to find the pricing point that is best for their products. Let us look at price elasticity to reveal the facts behind this relationship.

What is Price Elasticity?

Price elasticity of demand (PED) is an economic measure representing the responsiveness, or elasticity, of the quantity demanded to the change in the price of a product or service. To simplify, it is the ratio of percentage change in quantity demanded of a product in response to the percent change in its price.

Most markets are sensitive to the price of a product or service: the cheaper the product, the higher the demand, and vice versa. While this need not be true for all products and services, price elasticity as a quantifiable phenomenon shows precisely how sensitive customer demand is to a product’s price. For starters, let us look at the questions marketing professionals try to answer when determining the elasticity of their products:

  • How much more can you sell by lowering the product price?
  • How will the rise in the price of one product affect sales of the other products?
  • How will a decrease in the market price of a product affect the volume of production & market supply?

What Is Price Elasticity of Demand & Supply?

Let’s look at Coca-Cola to try and understand these concepts. Assuming that a bottle of Coca-Cola regularly costs $1, if the price surges to $2, it will likely result in a dip in demand as most people would consider it expensive. On the other hand, if you drop its price to 10¢, you will notice a significant rise in its demand.

Price elasticity springs from the fundamental economic law of supply and demand:

  • Cheaper the product, the higher the demand
  • And the more expensive a product becomes, the lower the demand

How likely is Sales Demand to Change When Price Changes?

Price elasticities are usually negative: when the price decreases, sales demand increases. It makes sense, doesn’t it? However, positive price elasticities do occur in rare cases where products do not conform to the law of demand.

These cases arise with Veblen goods, which are typically high-quality, exclusive, and a status symbol, so demand for them can rise along with price. Louis Vuitton is a classic example.

What will the Price vs Consumer Demand Curve Look Like?

The chart above shows that a 33% increase in price point decreases consumer demand by a million, whereas demand doubles if we drop it by 50%.

Mathematical formula to calculate the price elasticity –

Price elasticity (E) is the percentage change of an economic outcome (generally the number of units sold) in response to a 1% change in its price:

E = (% Change in Quantity Demanded) / (% Change in Price)

% Change in Quantity Demanded = (New Quantity – Old Quantity) / Average Quantity

% Change in Price = (New Price – Old Price) / Average Price

For example, say a clothing company raises the price of a coat from $100 to $120, and the 20% increase in price causes a 10% decrease in the quantity sold, from 1,000 coats to 900. (Here the percentage changes are taken against the initial values rather than the averages.) Plugging these numbers in gives a price elasticity of demand of -0.5:

-10% / +20% = -0.5, or 0.5 in absolute value
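The coat example above can be reproduced in a few lines of Python. This is a minimal sketch (the function and variable names are ours, not part of any library), using the initial values as the base for the percentage changes, as in the example:

```python
def price_elasticity(old_price, new_price, old_qty, new_qty):
    """Price elasticity of demand, with initial values as the base."""
    pct_qty = (new_qty - old_qty) / old_qty          # % change in quantity demanded
    pct_price = (new_price - old_price) / old_price  # % change in price
    return pct_qty / pct_price

# Coat example: price rises $100 -> $120, sales fall from 1,000 to 900 coats
e = price_elasticity(100, 120, 1000, 900)
print(round(e, 2))  # -0.5, i.e. 0.5 in absolute value
```

Using the average (midpoint) values as the base, as in the formulas above, would give a slightly different number for the same data.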

Types of Price Elasticity

  1. Perfectly elastic (E = ∞): An infinitesimally small change in price causes an unbounded change in the quantity demanded
  2. Relatively elastic (E > 1): Minor changes in price cause large changes in quantity demanded
  3. Unit elastic (E = 1): Any change in price is matched by an equal change in quantity
  4. Relatively inelastic (E < 1): Substantial changes in price cause only minor changes in demand. Gasoline is a good example: as an essential commodity, its demand stays relatively the same even when its price rises.
  5. Perfectly inelastic (E = 0): Quantity demanded does not change at all when the price changes. This happens with products or services for which consumers have no substitute goods, such as life-saving medication.
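The five categories above can be mapped from the magnitude of the elasticity. A small helper (our own naming, shown here for illustration) makes the classification explicit:

```python
def elasticity_type(e):
    """Classify demand by the magnitude |E| of the price elasticity."""
    m = abs(e)
    if m == 0:
        return "perfectly inelastic"
    if m < 1:
        return "relatively inelastic"
    if m == 1:
        return "unit elastic"
    # Perfectly elastic demand is the limiting case |E| -> infinity
    return "relatively elastic"

print(elasticity_type(-0.5))  # relatively inelastic (e.g. the coat example)
```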

Factors of Price Elasticity:

  1. Purchase probability: The likelihood of a customer purchasing a product from a particular category. Take beer, for instance, where various brands in the same category sell at differing prices. Purchase probability determines how likely a customer is to buy one brand instead of another. If we can compute the aggregate price of the category, then by the law of demand the likelihood of purchasing beer decreases as its aggregate price rises. Calculating the price elasticity reveals this change in demand.
  2. Brand choice probability: Brand choice probability describes the customer’s choice among brands. If you work for Oreo, you are more likely to be concerned with Oreo’s sales than with overall biscuit sales. That is why marketers focus on persuading customers to select their brand over competitors’. When the cost of a brand’s product increases, the chance of purchasing that brand decreases.
  3. Purchase quantity: The number of units a customer is expected to buy in a single purchase.

Analytical Model Implementation

We understand the business context of price elasticity and how important it is for every business to maintain a healthy financial sheet. The next operational question is how to implement it. We have used linear regression, with either the Ordinary Least Squares (OLS) or Recursive Least Squares (RLS) method, to predict quantity from the price change over time for any specific product.

Linear Regression Equation:

Quantity = α + β × Price + ε … (1)

β is the beta coefficient (slope). Generally, beta is negative with respect to quantity sold, except for a few exceptions where it can be positive.
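The OLS slope and intercept for a single-variable regression can be computed directly from the closed-form formulas. Below is a minimal, dependency-free sketch with toy weekly data (the numbers are illustrative, not from the article):

```python
def ols_fit(prices, quantities):
    """OLS fit of Quantity = alpha + beta * Price (single regressor)."""
    n = len(prices)
    mean_p = sum(prices) / n
    mean_q = sum(quantities) / n
    # beta = Cov(P, Q) / Var(P); alpha makes the line pass through the means
    cov_pq = sum((p - mean_p) * (q - mean_q) for p, q in zip(prices, quantities))
    var_p = sum((p - mean_p) ** 2 for p in prices)
    beta = cov_pq / var_p
    alpha = mean_q - beta * mean_p
    return alpha, beta

prices = [1.00, 1.10, 1.20, 1.30, 1.40]   # weekly average price ($)
units  = [1000,  940,  880,  820,  760]   # weekly units sold
alpha, beta = ols_fit(prices, units)
print(alpha, beta)  # beta is negative here: demand falls as price rises
```

In practice one would use a statistics library (e.g. statsmodels' OLS) instead of hand-rolled formulas, but the arithmetic is the same.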

Multivariate Linear Regression:

If additional supporting factors are directly linked with sales, we may utilize them in a multivariate linear regression alongside the price variable.

Price Elasticity Score:

To get the elasticity value of a product, we differentiate equation #1 above. As price and units have a linear relationship, beta (dQ/dP) remains constant, so we can evaluate the elasticity at the average values of price and units:

E = β × (Average Price / Average Quantity)
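Given a fitted slope, the elasticity score follows in one line. A minimal sketch (the function name and the sample numbers are ours, chosen only to illustrate the formula):

```python
def elasticity_score(beta, avg_price, avg_units):
    """Point elasticity at the averages: E = beta * (mean price / mean units)."""
    return beta * (avg_price / avg_units)

# With a fitted slope of -600 units per $1 and averages of $1.20 and 880 units:
e = elasticity_score(-600, 1.20, 880)
print(round(e, 2))  # about -0.82: relatively inelastic at these averages
```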

Business Application:

Once the price elasticity model is ready, the next step is to apply the learning to reveal a product’s sensitivity in the market.

The above chart shows that the Samsung 65-Class LED TV, with a negative price elasticity of -17.68, would see roughly 176.8% more demand from a 10% drop in its price, or lose a corresponding share of its sales demand with a 10% rise in price.

Whereas the Sony XBR-X850E-Series 75-Class TV, with a positive price elasticity of 7.19, would see about a 71.9% drop in its sales demand from a 10% price reduction and a similar boost in sales demand from a 10% increase in its price.
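The translation from an elasticity score to an expected demand shift uses the linear approximation %ΔQ ≈ E × %ΔP. A tiny sketch (our own naming, with a generic example rather than the chart’s products):

```python
def expected_demand_change(elasticity, price_change_pct):
    """Approximate % change in demand for a given % change in price."""
    return elasticity * price_change_pct

# A product with E = -2.0 should see demand rise ~20% on a 10% price cut
print(expected_demand_change(-2.0, -10))  # 20.0
```

Note that for very elastic products this linear approximation can exceed 100% for large price moves, which signals that the price change is outside the range where the fitted line is trustworthy.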

What Next?

New problems arise the moment you solve a business problem. For instance, what is next? Or is there a better approach? What can we do additionally to improve the implemented solution? The case is the same for price elasticity models. We see that changing the price of a product affects its sales. But what happens with a change in the price of a competitor’s products? The phenomenon that causes a shift in demand for one product from a change in the price of competing products is known as Cross-Price Elasticity. But more on that in the next blog!


Copyright © 2024 Affine All Rights Reserved

Manas Agrawal

CEO & Co-Founder
