Azure-Powered Telescope®: Leveraging the Best of the Azure Stack!

Telescope, our new satellite image segmentation offering powered by Azure, integrates seamlessly with Azure storage and database services, enabling customer-oriented applications that minimize geospatial analysis challenges.

Over the last few years, the number of satellites has exploded. While there were fewer than 20 remote sensing satellite launches in 2008, this year alone there have been more than 150. The amount of data being acquired from satellites is also increasing exponentially, thanks to the falling costs of electronic components and machine vision, along with increasing private sector participation. As per a recent report, the global geographic information systems (GIS) market is expected to reach US$13.6 billion by 2027, up from US$6.4 billion last year.

In parallel, Artificial Intelligence (AI) has matured quickly over the last few years, allowing organizations worldwide to automate drawing insights from vast quantities of data at a faster pace than ever before. A vast trove of satellite image data is waiting to be utilized for value generation across multiple domains, from real estate and agriculture to military, urban planning, and disaster management, to name a few. This is where Affine seeks to add value with our home-grown tool, Telescope®.

What is Telescope®?

Telescope® is a next-generation AI satellite image segmentation solution capable of addressing complex business and operational requirements. Telescope® uses an in-house machine learning framework to classify every region of a satellite image into one of six categories: buildings, greenery, water, soil, utilities, or others. This AI-generated segmentation data can be utilized for diverse business purposes such as pattern identification and object tracking.

How can Telescope® help businesses?

Telescope® leverages AI in a software package built on Azure cloud services, allowing you to extract valuable data pertaining to your business. The platform lets users perform image analysis on high-resolution satellite images and view a location and its surroundings with accurate coverage percentages for greenery, land, buildings, and water bodies. It works at a resolution comparable to the street level: for example, it can differentiate buildings, which span tens of meters, from utilities such as roads, which are only a few meters wide.

Microsoft Azure's role in Telescope®

Azure helps us make effective use of cloud computing. With Telescope®, we don't need to run the model for inference continuously, only as demand arises, so businesses are charged only for the instances actually utilized. At the same time, user history and information can be accessed far more securely and with tighter restrictions. Azure's flexibility also makes the application easier to deploy, and with Azure Functions we can ensure that Telescope® scales automatically as usage fluctuates over time.
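
As a minimal sketch of what such an on-demand inference endpoint could look like, the hypothetical HTTP-triggered Azure Function below runs segmentation only when a request arrives; on a consumption plan, instances scale up and down with demand. The run_segmentation helper is a placeholder for the actual Telescope® model call, not part of any published API.

```python
import json
import logging

import azure.functions as func


def run_segmentation(lat: float, lon: float) -> dict:
    # Hypothetical stand-in for the actual Telescope® inference call;
    # returns per-class coverage percentages for the requested location.
    return {"buildings": 0.0, "greenery": 0.0, "water": 0.0,
            "soil": 0.0, "utilities": 0.0, "others": 0.0}


def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function: the model runs only when a request arrives."""
    logging.info("Segmentation request received.")
    try:
        body = req.get_json()
    except ValueError:
        return func.HttpResponse("Expected a JSON body.", status_code=400)

    lat, lon = body.get("latitude"), body.get("longitude")
    if lat is None or lon is None:
        return func.HttpResponse("Missing 'latitude' or 'longitude'.", status_code=400)

    result = run_segmentation(float(lat), float(lon))
    return func.HttpResponse(json.dumps(result), mimetype="application/json")
```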

During model training, Azure ML was quite useful in bringing down costs. Training involved close to 1,000 images marked for segmentation, which meant a large pipeline for cleaning, labeling, and analyzing image data. We also had to train the segmentation model while tuning various hyperparameters, which required powerful GPUs. With Azure ML, we were able to allocate GPUs only for the duration of the training and thus bring down the cost. With Azure pipelines, we could automate the training process, further reducing manual effort.
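
A minimal sketch of that pattern using the Azure ML (v1) Python SDK: a GPU cluster configured with min_nodes=0 releases its nodes, and their cost, as soon as training finishes. The cluster name and VM size below are illustrative assumptions, not the actual values used for Telescope®.

```python
from azureml.core import Workspace
from azureml.core.compute import AmlCompute, ComputeTarget

ws = Workspace.from_config()  # assumes a config.json exported from the Azure portal

# GPU cluster that scales to zero when idle, so GPUs are billed only while training runs.
gpu_config = AmlCompute.provisioning_configuration(
    vm_size="Standard_NC6",             # illustrative GPU SKU
    min_nodes=0,                        # release all nodes (and their cost) when idle
    max_nodes=4,
    idle_seconds_before_scaledown=1200,
)

gpu_cluster = ComputeTarget.create(ws, "telescope-gpu", gpu_config)
gpu_cluster.wait_for_completion(show_output=True)
```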

The results from Telescope® are saved in comma-separated values (CSV) format, which can be seamlessly integrated with any Azure database service. Using the information saved in these databases, we can build Power Apps applications that address the client's requirements.
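
As an illustration of that hand-off, the sketch below loads a hypothetical Telescope® results CSV with pandas and appends it to an Azure SQL Database table via SQLAlchemy; the file name, table name, and connection string are placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string for an Azure SQL Database; server, database,
# credentials, and driver version would come from your own environment.
engine = create_engine(
    "mssql+pyodbc://user:password@myserver.database.windows.net:1433/telescope"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Load the segmentation results exported by Telescope® (hypothetical file and
# table names) and append them to a table that a Power Apps app can read.
results = pd.read_csv("telescope_segmentation_results.csv")
results.to_sql("segmentation_results", engine, if_exists="append", index=False)
```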

Azure Marketplace Consulting services for the AEC industry:

  1. Land Survey using Azure & Telescope®: Leverage Azure Services to conduct a land survey with valuable insights that can help convert the aerial dataset into a CAD site plan using our AI-based Land Survey consulting offering.
  2. AI-based Site Feasibility Study using Azure Service & Telescope®: Leverage Azure services for a feasibility study of your upcoming construction project using Affine’s AI-based Site Feasibility Study 6-week PoC consulting offering.

How can Azure-powered Telescope help the Architecture, Engineering, and Construction (AEC) industry?

  1. Architecture, Engineering, and Construction (AEC): One of the domains with vast potential for Telescope® is AEC. A site feasibility study plays a crucial role in the construction project management process: it helps companies map the road ahead and determine whether desired outcomes align with reality. Having a good estimate in hand before going out to the field helps determine the feasibility of a location early on, saving the builder time and resources in capacity planning and impact assessment. With Telescope®, one can quickly estimate how the land is currently used and which areas a project could expand into.
  2. Property survey: Another area where Telescope® can contribute is real estate valuation. Before determining a property's value, be it for insurance, rental, or sale/purchase, the interested parties want to know about the surroundings of the building. Factors such as the density of nearby buildings, the presence of parks or lakes, and access to the metro or highway can influence prices significantly. This information is helpful for parties on both sides of the deal: both can obtain the details quantitatively and use them in predictive modeling to arrive at a fair valuation.
  3. Reconstruction post-catastrophic events: With the help of Telescope®, you can monitor and quantify the impact of catastrophic events such as volcanic eruptions, wildfires, and floods, which can be used for emergency response as well as reconstruction. Regular satellite updates allow you to analyze how the destruction has spread and compare it with pre-catastrophe levels. For reconstruction, businesses can use the segmentation data to speed up the estimation process, typically the most time-consuming part of the effort.

Features of Telescope®

At the core of the Telescope® tool lies Affine's proprietary backend deep learning framework, which has several advantages. Our algorithms produce crisp object boundaries in regions that are over-smoothed by previous methods, providing more accurate results. Their efficiency also enables output resolutions that are impractical for other approaches in terms of memory and compute utilization; this allows us to use fewer resources and pass the lowered cost on to the client.

Another major feature of the Telescope® tool is its simple interface. You can feed in geospatial coordinates (latitude and longitude) or select the location from the map to perform the analysis. Telescope® then selects an area around that point (on the order of 0.1 square kilometers) and performs the segmentation task. With such a simple approach, even someone not well-versed in geospatial analysis can start using the Telescope® tool with ease.
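
To make the coordinate-to-area step concrete, here is a small, self-contained sketch of how a fixed-size analysis window could be derived from a latitude/longitude pair. It is an illustration of the idea, not Telescope®'s actual geometry code.

```python
import math

def bounding_box(lat: float, lon: float, area_km2: float = 0.1):
    """Return (south, west, north, east) for a square of roughly `area_km2`
    centered on the given point, using a flat-earth approximation that is
    adequate at this scale."""
    half_side_km = math.sqrt(area_km2) / 2.0
    dlat = half_side_km / 110.574                                   # km per degree of latitude
    dlon = half_side_km / (111.320 * math.cos(math.radians(lat)))   # shrinks with latitude
    return (lat - dlat, lon - dlon, lat + dlat, lon + dlon)

# Example: a ~0.1 sq. km window around a point in Bengaluru.
print(bounding_box(12.9716, 77.5946))
```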

In its current state, Telescope® is a generalized solution for segmenting any kind of structure from satellite images across different business requirements. It can therefore be adapted to other use cases, with the added advantage of quick integration and deployment through the solution's APIs.

Role of AI in Telescope®

The AI revolution is impacting all sectors and opening a new door for geospatial analysis in business. One common theme across all sectors is how it simplifies human effort: before venturing out into the field, it helps you understand the problem's complexity and how to approach it. In the real estate sector, for instance, little decision-making can happen without going out to the field, yet field visits are logistically expensive. What if one could instead understand the property of interest and get the numbers while sitting in front of a computer, at a fraction of the cost of a field visit? This makes decision-making much faster while providing tools to reason transparently. This is what we seek to achieve with Telescope®.

Our vision: How are we envisioning Telescope®?

Telescope® is our major leap into geospatial analysis, which is very much in line with the vision of Affine: “to bring about the evolution of business decision-making through the adoption of the new in decision science and technology.” As a forward-thinking company, we have always believed in staying at the bleeding edge of decision science through a culture of celebrating excellence, continuous learning, and customer orientation. We strive to be a catalyst for business transformation underpinned by AI, Data Engineering, & Cloud. And Telescope® encompasses all these aspects.

That does not mean that we are at the end of our road. We pursue excellence to deliver the best possible results. Our approach and efforts are always backed with the intent of delivering improved customer satisfaction. With this, we share our vision for Telescope®.

Notably, anyone with access to the internet knows how to use online maps, and armed with such basic knowledge they can operate Telescope® quite easily. However, as customer requirements grow more sophisticated, commonly available online mapping interfaces may not be enough, and more complex data analysis is required. We have designed our APIs so that, with minimal change, they can integrate more complex satellite data, be it open government programs such as Landsat and Copernicus or databases from a proprietary vendor.

  • Time-dependent analysis: With access to new datasets, we can perform time-dependent analysis. Depending on customer requirements and the data vendor, the frequency of data acquisition or access can be configured; once it is, Telescope® can easily process this periodic information, and users can monitor and analyze the results with little effort.
  • Fine-grained analysis: We also seek to enhance Telescope® with more fine-grained analysis capabilities. For example, when a building is detected, Telescope® could provide further details such as its height; when greenery is detected, it could determine whether it is forest or agricultural land and, if agricultural, what crop is grown. This multilevel analysis will provide more information, empowering the customer to make more nuanced decisions.
  • Drone surveillance: We are also looking at another upcoming domain within geospatial analysis: drone surveillance. Drones are gaining popularity and sophistication, and the day is not far off when they are regularly utilized for geospatial analysis and progress monitoring in the AEC industry. We have designed our core model with this in mind: the image resolution we use for analysis is comparable to that of drone-captured imagery, so tasks such as excavation and earthwork progress monitoring could be accomplished by analyzing it. However, drones come in many varieties and face diverse regulations in different countries. Hence, we intend to develop a standardized methodology for image acquisition, after which Telescope® will be able to process even drone-captured images.



Comparing the Big Three – AWS Vs. Azure Vs. GCP – From the PoV of an AI & Analytics Services Partner

Unless you have been living under a rock or are completely alienated from the tech world, you'd know that Amazon Web Services ("AWS"), Microsoft Azure ("Azure"), and Google Cloud Platform ("GCP") are the three major public cloud providers in the world today, with a combined market share of approximately 64%. Trailing behind them are Alibaba Cloud, IBM Cloud, Oracle Cloud, and others, who, in my opinion, need to either go niche or take a big-bang approach if they want to catch up with the dynamic requirements across industries being catered to by the big three. But have you ever thought about what the big three bring to the table from the point of view of an AI & Analytics services partner? Let's take a closer look at how each of these service providers differs in its value proposition.

What makes AWS, Azure & GCP the same, but still different from each other?

Of the three biggies, AWS is the oldest of the lot, far more mature in aspects such as ease of integration and the technical prowess of its products, and knocking it out of the park in terms of market share; even so, its partner programs can learn a thing or two from Azure's playbook. From strategic sales relationships with partners and joint GTM to general sales teaming to grow partners' business, AWS is slightly behind. On the other hand, given that it has been in the market longer and is a lot bigger in size, the sheer volume of demand it sees could more than make up for this gap.

It’s not all bad for AWS, though – Irrespective of the size of the partner, they provide a personal touch during partner onboarding and help navigate the ocean that’s AWS, which is more than what I can say for Azure. Although a leader in profits & pricing for partners, Azure is notorious when it comes to personal touch & handholding for new partners, unless you can show the size and $$$$. Even for lead sharing and demand generation, Azure is known to prefer existing & known partners to new ones (not that I blame them!). I like to call Azure the “Business” cloud, as it has an unequivocal focus on improving its customers’ business rather than focusing too much on the technology perfection side of things.

All said, Azure today is one of the leading choices for companies as an enterprise cloud due to its native integration with other Microsoft products such as M365, Teams, Dynamics, ERP, etc., and superior support for hybrid cloud infrastructure. On the other hand, AWS, with its reluctance to play in the hybrid or private cloud space, is lagging in this area; with a bulk of Fortune 500 companies moving to the hybrid model, that may hurt AWS in the long run and knock a few points off its market share.

Just like Azure, GCP also offers great support for the hybrid cloud model through its offering, Anthos, but it is currently considered primarily a support cloud rather than an enterprise one due to the lower maturity of its larger product capabilities (both IaaS and PaaS), lower ease of integration, and less evolved documentation, processes, and features. From a partner's standpoint, GCP is actually not behind on demand generation and lead sharing (compared to Azure), and that is saying something, given that GCP came into the enterprise limelight only in the 2019-2020 timeframe. However, due to less evolved partner programs, lower profitability, and limited sales relationships and support, GCP and its partners have not seen the kind of growth they set out for.

That said, with Mr. Kurian at the helm, GCP is very quickly making a name for itself as a huge proponent of its customers and their businesses, even with a highly developer-focused tech stack that is open-source friendly and DevOps centric. It has also started ramping up its partner programs and strategy, which have grown by more than 400% in the last couple of years.

Capabilities of AI, ML & Analytics in AWS Vs. Azure Vs. GCP 

Talking about Artificial Intelligence, Machine Learning, and Analytics capabilities, despite its slower growth, GCP is emerging as a hands-down leader with its powerful infrastructure, low latency, and superior performance for high-end computing workloads. This is further strengthened by its data-based DNA: thanks to its other free product offerings, GCP has had access to tons of data, which has allowed it to create best-in-class AI capabilities. On top of this, GCP further distanced itself from AWS and Azure by unveiling the Vertex AI platform in 2021, which brings Google's ML services under one roof to simplify the process of building, training, and deploying ML models at scale.

Azure, with its suite of cognitive services and other AI/ML offerings, does not have as broad a spread as AWS or GCP, but the ones available are much more function-specific, which is in line with their “Business” first approach. However, from a performance standpoint, neither Azure nor AWS can match up with Google’s Tensor Processing Unit (TPU).

AWS has the widest variety of services available under the AI/ML & Analytics banner compared to the other two, but offers less flexibility and fewer out-of-the-box algorithms, which makes it less favorable in certain cases.

At an overall level, while the differences in AI/ML & Analytics capabilities are more than noticeable between the three biggies, this area is seeing a continuous infusion of investment from all three service providers and the gap between them is bound to narrow down in the not-so-distant future.

Some of the key AI/ML & Analytics offerings provided by these public clouds include ML as a service, language capabilities, speech-to-text and text-to-speech, vision-based recognition of images and videos, anomaly detection, NLP and text analysis, conversational AI, and many more.

Summing Up!

With a major pivot in IT and data strategy from fully on-prem to cloud/hybrid approaches, it has become more important than ever for companies to make the right choice of cloud. Building and operating your systems in-house is no longer necessary. As technology progresses and becomes more versatile and OpEx-driven, cloud adoption has become an integral part of any company's IT strategy. With this new paradigm, there has been a shift in how businesses allocate resources across the platforms best suited to their specific needs. Affine can assist you here, as we have vast expertise in moving between all three major cloud providers. Schedule a call today and talk to our cloud experts.

How to Ace Cross-Cloud Migration?

Moving from on-prem to the cloud is an enormous effort for an enterprise and can span years. And once the migration to the cloud is complete, an enterprise might decide to move entirely to a new cloud, or to move parts of its current applications to a new cloud platform, for cost, performance, or other reasons. This is an even more tedious effort than on-prem to cloud migration. In this blog, we will discuss the cross-cloud migration checklist, workload optimization, and key considerations that help businesses move from one cloud vendor to another successfully.

Business Purpose:

We need to clearly understand the pros and cons of the source cloud platform, which we expect the team to know well since it is their incumbent environment. We must then deeply analyze the target cloud platform: it should address those cons while retaining the pros, such as performance, low cost, and powerful application features.

Recognize and include stakeholders:

Connect with core team members throughout your organization, including IT and business partners. Early commitment and backing will result in a smoother, quicker cloud migration process.

Evaluate the services of the source platform:

  • Perform end-to-end analysis of all the services in source platforms, applications, data, etc.
    • Thorough analysis of the data sources, data flows, data models, and ETL processes
    • Detailed info on databases, tables, partitions/clustering/indexing, data dictionaries
  • Have a close working session with the incumbent developers and IT in the team 
  • Decide on KPIs to measure and report on (Duration, Delay, Disruption, Costs per service/bandwidth)
  • Categorize source data into Hot, Warm, and Cold

Analogous target platform Tools/Services

  • Identify the suitable services/service models (SaaS, PaaS, IaaS) in the target platform which can replace services in the source platform and perform as expected according to the business use cases
  • Implement a proof of concept (POC) to validate the approach for each service

Cost Estimates:

  • Analyze the egress and ingress charges from the source to the target environment (see the sketch after this list)
  • Identify the overall cost of the source architecture at the service level from the billing dashboards that the source cloud vendor provides
  • Analyze the cost of the services to be billed on the target platform; it should be comparable to or lower than the source platform cost
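
A back-of-the-envelope sketch of how such a transfer estimate might be framed; the volume and per-GB prices below are placeholders, not actual vendor rates, so always check the current rate cards.

```python
def estimate_transfer_cost(data_gb: float, egress_per_gb: float, ingress_per_gb: float = 0.0) -> float:
    """One-time transfer estimate: source egress plus target ingress charges.
    Prices are illustrative inputs; always verify against current rate cards."""
    return data_gb * (egress_per_gb + ingress_per_gb)

# Example: 50 TB moved out at a hypothetical $0.08/GB egress rate, free ingress.
data_gb = 50 * 1024
print(f"Estimated transfer cost: ${estimate_transfer_cost(data_gb, egress_per_gb=0.08):,.2f}")
```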

Optimize Workloads for better Performance

  • Remodeling the Data Schema: Rearchitect the Data Schema to better fit with the new services in the target platform, based on its features and processes.
  • Removal of Duplicate Data: Deleting redundant/duplicate data during migration, after detailed cross-checking with the business, can improve performance.
  • Conversion of SQL: Convert the complex SQLs to simple ones for ease of maintenance and performance. 
  • Revalidation: Re-validate all optimization techniques to ensure better performance as this migration effort is an opportunity to reduce technical debts and address performance. 
  • Rearchitecting the Application: Re-architecting the applications to be functional in the target cloud platform could result in less resource utilization and better performance.
  • Lift and Shift
    • Various cloud vendors follow different approaches to data storage and data accessibility/retrieval, so we need to dive deep into their techniques and rearrange our data accordingly
    • Roughly 90% of cross-cloud migration projects cannot be handled as a pure lift and shift and might need an entire re-architecture
  • Huge data Migrations
    • We need to watch out for data pulled from the source cloud vendor, as egress and bandwidth charges apply. Try to compress the data and make it as small as possible, then plan out the migration
  • Long Dependency on an old Cloud provider
    • If we move to a new cloud vendor, try to cut off the use of services from the old cloud vendor; if not, we will end up paying both cloud providers
  • Serverless option for Performance-based processes
    • For services that depend more on performance and readiness, plan wisely to choose between the dedicated pool and serverless options. You might see a 2x to 2.5x performance difference for massive data sets (Terabytes)
    • Observe the performance differences between the two cloud providers; they can vary widely, and since migration costs are involved, this will become a bottleneck if not planned for in advance
  • Data Cleansing
    • Clean the data as much as possible before getting into the migration process. It will help shrink the record count/size of the data, which is directly proportional to lower migration costs.
  • Pilot run 
    • Before migrating the complete data set, performing a proof of concept (POC) with a sample of the data is essential. This approach will give you clarity about the process execution and the plan
  • Parallel loads
    • During the migration planning phase, make a checklist of items that need to run in batches and in real-time; prepare the data accordingly. If it is not well planned, the entire system performance might have to be addressed during the later phases, which might include a re-architecture of specific non-performant items
  • Automated testing
    • The absence of automated source-target validation would result in a larger chunk of time spent only on validations, or in cutting corners on this activity. Loaded data needs to be validated and reconciled with the source system in an automated manner; see the sketch after this list.
  • Capacity planning
    • Data migration could be a resource-intensive operation and may require capacity planning. It’s always wise to plan the size of CPU, memory, storage, hardware, tools and have them readily available before the beginning of migration so that the migration effort can be successful.
  • Pipeline overload
    • Too many ETL activities in a single pipeline make debugging difficult and can leave dependent steps waiting or deadlocked on previous actions. Keep pipelines as small and granular as possible; they will be easier to maintain, debug, upgrade, and replace
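
As a minimal sketch of automated source-target reconciliation, the snippet below compares row counts and an order-independent checksum between two hypothetical warehouses. The connection strings, table, and key names are placeholders, and for very large tables you would push the counts and checksums down into SQL rather than pulling full data into pandas.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection strings for the source and target warehouses.
source = create_engine("postgresql://user:password@source-host:5432/analytics")
target = create_engine("postgresql://user:password@target-host:5432/analytics")


def reconcile(table: str, key: str) -> dict:
    """Compare row counts and an order-independent checksum between the source
    and target copies of a table; assumes column names and dtypes line up."""
    src = pd.read_sql(f"SELECT * FROM {table}", source)
    tgt = pd.read_sql(f"SELECT * FROM {table}", target)

    src_sorted = src.sort_values(key).reset_index(drop=True)
    tgt_sorted = tgt.sort_values(key).reset_index(drop=True)
    src_hash = pd.util.hash_pandas_object(src_sorted, index=False).sum()
    tgt_hash = pd.util.hash_pandas_object(tgt_sorted, index=False).sum()

    return {
        "table": table,
        "source_rows": len(src),
        "target_rows": len(tgt),
        "checksums_match": bool(src_hash == tgt_hash),
    }


print(reconcile("orders", key="order_id"))
```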

Summing Up!

Any cross-cloud migration should be treated as an extensive effort that might start as an easy lift and shift and turn into an extensive migration and re-architecture effort. Expertise is required in both the source and target cloud platforms to ensure no loss of functionality, eliminate tech debt, and improve performance. This is something that Affine can help you with, as we have extensive experience in migrating between all three big cloud providers. Redefine your business goals by leveraging our cloud practices. Schedule a call today and talk to our cloud experts.

5 Key Factors in Industry 4.0: The Game Changer for Manufacturers in 2022

The fourth industrial revolution is ushering every industry into immense transformation, with enormous advantages and implementation challenges. The goal of Industry 4.0 is to integrate physical and digital technologies into a cyber-physical system (CPS) that reflects the digital world in the physical world and vice versa, while also enhancing the customer experience. For organizations, implementing Industry 4.0 is a daunting challenge. The process starts with understanding the existing workflow of the business, identifying the bottlenecks, and selecting the right technologies to overcome the workflow and business problems. To make this process simpler, our research team and industry experts suggest five key factors that help organizations implement Industry 4.0 with fewer complications and faster success.

5 Key Factors: Effective Ways to Implement Industry 4.0 in Your Organization

There are numerous factors that organizations ought to consider in order to embrace Industry 4.0 successfully. However, here are the five most essential factors that help shape Industry 4.0 at its core and make the implementation process hassle-free. When integrated, each of these pieces creates a capability that can help bring about transformation.

1. Identifying and Implementing Right Technologies

Customer-centric technologies have enabled OT-IT convergence, which is highly recommended for manufacturers: it requires them to integrate their IT (Information Technology) and OT (Operational Technology) and transform real-time data into actionable intelligence. Enterprise IT must securely collect data from OT equipment, process it, and share the necessary insights with OT professionals and other internal and external stakeholders to maximize IIoT project return on investment (ROI).

Integrating OT plant data with IT systems such as ERP, WMS, and other third-party IT suites, whether on-premises or in the cloud, provides clear shop-floor data visibility, for example by connecting Computerized Maintenance Management Systems (CMMS) with SCADA systems and cloud/edge-based remote monitoring solutions.

AI (Artificial Intelligence) also brings the flexibility to adopt various technologies that enable software and machines to perceive, comprehend, act, and learn, either independently or to augment human operations.

By incorporating artificial intelligence, industrial production can become far more efficient than manual processes. These technologies offer vast potential for manufacturing companies to work faster and more flexibly, helping achieve the best possible quality while cutting down on the resources used and driving greater production efficiency.

2. Selecting The Right Use Case for Your Need

When a new technology is introduced, the scope and significance of its advantages are usually unclear until it is put to use. Early IIoT adopters and digital transformation pioneers served as role models for other sectors interested in exploring the new avenues offered by emerging technologies. It takes a particular vision to apply their use cases to a different business, but doing so eliminates the risk of buying on trust.

Use cases are just the beginning; the idea is to learn what others have done or are doing and find ways to make the outcomes they're getting relevant to your business. The first step is to understand why the company is still attempting to figure out its IIoT strategy and to build a list of problem statements it plans to resolve. The next step is to look to peer companies, research analysts, and solution providers' case studies to understand digital transformation use cases, entry points, techniques, and rewards. Start with a PoC that requires low resources and low effort but has high impact; once the results are visible, start implementing further use cases following the same strategy.

A few use cases offered by Affine for different functions:

3. Identifying the Skill Gap and Filling It with the Right Skillsets

To envision all of the skills required to implement Industry 4.0, organizations are looking at IoT in the context of a highly autonomous production line. Such a line could comprise additive manufacturing techniques, CNC lathes, and newer machines capable of executing highly variable, multi-step processes with the help of robotic vision, artificial intelligence, and cobots that work alongside humans. We now have a technology landscape that requires multiple skill sets, and the blending of those skills across silos, to create entirely new categories of technology professionals: those who understand the convergence of operational and information technologies.

Now is the time to think cross-discipline or multi-discipline. When people talk about the Internet of Things, they claim it is about digitizing things; in reality, it is about digitizing business processes. As a result, engineers, network specialists, application developers, data architects, UI (user interface) designers, and businesspeople must communicate with and comprehend each other. Industry 4.0 will necessitate multidisciplinary teams organized to solve complex challenges. Of course, specialists will be required, but they will also need to broaden their knowledge to cover other IT technologies such as cloud, AI, and analytics, as well as operational technologies such as robotics and process automation that keep factories and assembly lines operating. To do so, train your people, develop continuous learning programs as a regular practice, hire people with the knowledge needed to bridge the gap, and bring new people on board to help cross-learn and motivate the team. This unlocks their potential to create a sustainable workplace.

4. Change Your Vision and Transform Your Company Culture

"Digital transformation" has become a widely used expression across several industries and contexts to describe the process through which a company adopts and implements digital solutions that benefit its activities. It is crucial to notice the narrative around digital transformation, as it enables a cultural shift in the company. Remarkably, how often do we discuss culture as a consequence of an event rather than as the driving force behind it? This framing may still be missing some essential aspects.

The advantage of a digital tool, no matter how good it is or what benefits it offers, will be lost if the company is not prepared to handle it: the project's true potential stays hidden, resources are wasted, and it is on the brink of failure. Company culture shows in how the business aligns with the right operational practices; the key factors that facilitate a digital transformation approach are using it to drive cultural change and engaging shop-floor employees with the vision and roadmap, which keeps the process simple and transparent for CDOs and digital leaders to guide. As organizations prepare for and drive digital transformations, it is crucial to promote a culture where everybody is tech-savvy and security is everyone's consideration.

5. Finding the Right Partners/Vendors

Selecting the right vendor/partner for implementing Industry 4.0 applications plays a pivotal role: plan the scope of your initiative and align your goals with the company's general strategy. Start with pilot projects, validate results, and systematize the learning mechanisms early on to understand the scope aligned with your requirements. To achieve this, model projects and "best practices" should be promoted, and investments should be made in digital learning.

Whether they recognize it or not, most production managers today are already in a race. Adapting to and implementing new manufacturing systems and technologies is now part of the culture, and as we all know, integration and collaboration are at the core of Industry 4.0. You would not embark on a long journey without a map, and you should treat Industry 4.0 adoption the same way: creating a strategy is a crucial step in the procedure. Once you've determined your desired maturity level, you'll need to create a thorough implementation strategy to help you achieve your objectives.

A potential and effective partner will assess your current functioning, detect traps, comprehend obstacles, and provide a healthy way to proceed or a new course of action. Ultimately, this will solve your existing challenges and help you generate new value from them. As a matter of course, they should give you a roadmap for pushing your business to the next level, and they should explain it in a logical fashion without a slew of technical jargon.

Conclusion

Manufacturers and the manufacturing industry as a whole are seeking direction as we approach 2022, not least due to the ongoing global COVID-19 pandemic. However, the last few years have given us much learning that signifies resilience, innovation, and the sector’s ability to persevere in the face of adversity.

There are numerous opportunities for all industrial sectors, ranging from acquiring fresh talent to exploiting data more effectively to contribute to a more sustainable world. Smart Manufacturing and Industry 4.0 solutions and efforts will always be vital in manufacturing and many other sectors.

Are you looking to know how your industry will change in the era of Industry 4.0 and want to be a winner in this new world? Listen to our industry leaders and experts and ask your questions at our virtual event on “Demystifying Industry 4.0”. Our speakers are sharing real-life use cases and insights into opportunities that are driving their growth and success with Industry 4.0. Go ahead, Register Now – this is going to be a great event!

Stay tuned for more information!

Assessing Top 5 Challenges of Implementing Industry 4.0!

Today, the entire world is grappling with the COVID-19 pandemic, which has intensified supply chain concerns and prompted many businesses to rethink their sourcing strategies. Several businesses are focusing on localization for two reasons: one, to be closer to the source, and the other, to minimize the risk of disruption. In manufacturing, the move toward Industry 4.0 is driven by volatile market demands for better and quicker production techniques, shrinking margins, and intense competition among enterprises, none of which can be addressed without technologies like AI, data analytics, and cloud. However, SMEs and MSMEs are still struggling with several challenges in adopting Industry 4.0 initiatives. These obstacles may dissuade some manufacturing companies from adopting Industry 4.0, causing them to fall behind their peers.

The Top Five Challenges!

Although smart manufacturing is often associated with Industry 4.0 and digital transformation, SMEs and MSMEs still experience difficulties achieving Industry 4.0 goals. Here are the five challenges:

1. Organization Culture:

Evolving from ad-hoc decisions to data-based decision-making is one of the biggest challenges for any organization. Part of this is driven by data availability and by the CXOs' awareness of and willingness to adopt new digital technologies. Navigating the balance between culture and technology together is one of the toughest challenges of digital transformation.

2. Data Readiness/Digitization:

Any digital revolution succeeds on the availability of data. Unfortunately, this is where SMEs have one of their most significant gaps (and opportunities): most SME manufacturing plants lack basic data capture and storage infrastructure.

Most places have different PLC protocols (e.g., Siemens, Rockwell, Hitachi, Mitsubishi, etc.), and the entire data is encrypted and locked. This either requires unlocking encryption by the control systems providers or calls for separate sensor or gateway installations. Well, this is a huge added cost, and SMEs have not seen any benefits so far, as they have been running their businesses frugally.

3. Data Standardization and Normalization:

This is a crucial step in the Digital Transformation journey, enabling the data to be used for real-time visibility, benchmarking, and machine learning.

Most SMEs grow organically, with an intent to grow as profitably as possible, so IT and OT technology investments are typically kept to a bare minimum. As a result, most SMEs are missing the SCADA/MES systems that integrate the data in a meaningful way and help store it centrally. Without this middleware, most of the data must be sourced directly from individual sensors or PLCs and sent via gateways.

All this data cannot be directly consumed for visualization and needs an expensive middleware solution (for example, LIMS offerings from Abbott or ThermoFisher, or GE Proficy); this is again an added cost.

Additionally, the operational data is not all stored in a centralized database. Instead, it is available in real-time from Programmable Logic Controllers (PLCs), machine controllers, Supervisory Control and Data Acquisition (SCADA) systems, and time-series databases throughout the factory. This increases the complexity of data acquisition and storage.
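
As a minimal sketch of the standardization step described above, assuming a hypothetical mapping of vendor-specific tag names, the snippet below maps raw PLC/SCADA readings onto a common (line, metric, timestamp, value) schema so downstream analytics can treat every controller the same way. The tag names and schema are illustrative, not a real plant configuration.

```python
import pandas as pd

# Hypothetical mapping from vendor-specific PLC/SCADA tag names to a common schema.
TAG_MAP = {
    "SIEMENS.L1.TEMP_C":   ("line_1", "temperature_c"),
    "ROCKWELL/Line2/Temp": ("line_2", "temperature_c"),
    "MITSUBISHI:L3:TMP":   ("line_3", "temperature_c"),
}


def normalize(readings: pd.DataFrame) -> pd.DataFrame:
    """Map raw (tag, timestamp, value) readings onto a standard
    (line, metric, timestamp, value) schema."""
    out = readings.copy()
    mapped = out["tag"].map(TAG_MAP)
    out["line"] = mapped.str[0]
    out["metric"] = mapped.str[1]
    return out[["line", "metric", "timestamp", "value"]]


raw = pd.DataFrame({
    "tag": ["SIEMENS.L1.TEMP_C", "ROCKWELL/Line2/Temp"],
    "timestamp": ["2022-05-01T10:00:00Z", "2022-05-01T10:00:00Z"],
    "value": [71.2, 69.8],
})
print(normalize(raw))
```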

4. Lack of Talent for Digital:

Believe it or not, we have been reeling under a massive talent crunch for digital technologies. As of 2022, a huge war for digital talent is under way across services, consulting, and product-based companies.

As a result, we don’t have enough people who have seen the actual physical shop floor, understand day-to-day challenges, and have enough digital and technical skills to enable digital transformation. A systematic approach is needed to help up-skill existing resources and develop new digital talent across all levels.

5. CXO Sponsorship:

This is a key foundation for any digital transformation and Industry 4.0 initiative. Unless there's CXO buy-in and sponsorship, any digital transformation initiative is bound to fail. For CXOs to start believing in the cause, they need to be onboarded, starting with simple awareness of what's possible and emphasizing benefits and ROI as reasons to believe.

Once there’s a top-down willingness and drive, things will become much easier regarding funding, hiring of technical talent or consulting companies, and execution.

Final Takeaway

It should go without saying that the list above does not include all the challenges manufacturers encounter when they embark on the Industry 4.0 journey. Additionally, more industries and professionals should actively engage in skill improvement initiatives for immediate implementation and to prepare employees for the future. Industry 4.0 is more than a vision of the future of manufacturing; it's a blend of potential technologies, processes, and business models that will create new ways to make anything at a previously impossible scale.

What impact would that have on manufacturers? Who stands to benefit from Industry 4.0? How will it play out, and how will it begin in earnest?

Compelling insights will be unveiled at the Demystifying Industry 4.0 event on 13th May 2022.

Click Here to Register Now!

Stay tuned for more information!

The Future of Monetization with Web 3.0

As of April 2021, the global gaming market exceeded $300 billion in value. While a substantial chunk of that is due to rapid adoption during the recent pandemic and the availability of a plethora of mobile games, gaming is no longer child's play. A sustainable, long-term monetization strategy can't just be an afterthought; it is an essential building block for gaming, media, and entertainment businesses.

It's 2022, and we are entering a new dimension with bleeding-edge innovations across industry verticals. The line between gaming, media, and entertainment is getting thinner by the day. Seamlessly integrated experiences may have sounded fancy a couple of years back, but now they are a bare essential.

We’re about to enter the next generation of seamless connected-tech experiences, thanks to Metaverse. It’s time for content-based enterprises, from small indie studios to large AAA studios and OTT platforms, to reconsider their monetization strategy.

The Current State of Monetization 

Monetization in games, media, and entertainment has evolved over the decades.

Games were sold as finished products in cartridges and then on physical discs. They used to be one-and-done products for which the customer got the most for their spend. 

With the move towards digital game stores like Steam, Epic, GOG, and others, the monetization landscape saw a transformational shift.

For once, studios and publishers saved a lot of money since they didn’t have to print, package, and transport the physical game copies. They could also rely on fixing game bugs and patching them via Over-The-Air updates post-release, and then there’s Downloadable Content.

All of this still involved budget constraints and limited the possibilities for smaller players.

However, the real paradigm shift in monetization came with mobile games. They brought a shock-and-awe effect by leveraging freemium models, in-app purchases, and in-game ads, with paid versions that bypass the ads and paywalls found in free games.

In media and entertainment, the advent of streaming platforms saw a similar dynamic, and the typical commercials saw a shift. Traditional media houses were left behind with a significant gap as OTTs ruled the roost.

But now, we are about to witness another shift that will affect the current monetization strategies of the gaming, media, and entertainment industries.

The New Era of The Internet is Changing Everything

We have too many digital game stores and streaming platforms, and the users are starting to feel the pinch. Subscription fatigue is starting to set in, making monetization challenging for businesses.

On top of that, we’re on the verge of a new version of the internet, one where entertainment will be an intertwined concept tag-teamed with gaming. Metaverse will be at the helm, and digital transactions and monetization methods will witness a rapid transformation. Saurabh Tandon, President & Board Member at Affine, recently shared his thoughts on this.

We now have blockchains in the mix, which can change the whole economy of monetization for both creators and businesses. Like it or not, the metaverse might very well become a crucial player in the world economy.

Physical and Virtual Lives will Bridge for a Unified Experience

XR (Extended Reality), an amalgamation of Virtual Reality, Augmented Reality, and mixed reality technologies, will pave the way for our future entertainment content requirements.

So, what does this mean for businesses? Big Tech giants rule with an iron fist, and content moderation is a grey area in the current climate.

“With the next generation of the internet, we are looking at decentralization and a leap of technology,” said Christopher Lafayette, Founder and CEO at Gatherverse, when he recently spoke at a virtual summit.

Advertising has already changed since its inception, and today it’s focused on content creators and consumers.

With content creators and influencers, advertising has taken center stage, and their large subscriber bases help ads find takers among consumers.

Non-Fungible Tokens (NFTs) are now trending and are set to become a digital form of payment, letting users buy and trade digital assets. With users creating communities around such digital marketplaces, the playing field for monetization is in dire need of an update and can't rely on traditional practices.

What is the Future of Monetization?

The future of monetization with Web 3.0 may be uncertain, but the majority of Web 3.0's focus is on decentralization. The user is at the core and will be the driving force, be it for content, ads, or monetization. Adaptation is the need of the hour for businesses.

Sure, the traditional payment methods will remain. But businesses have to acknowledge the fact that blockchain will be thrown into the mix and change the dynamic of the digital economy. Rafael Brown, CEO & Co-Founder at Symbol Zero, who was a speaker in a recent tech symposium, said, “PC and Mobile gaming have established a monetization economy. As technology changes with time, we need to revisit our assumptions. The need of the hour for blockchain technology is to create sustainable monetization.”

The tech summit brought together more than 20 world leaders from Gaming, Media & Entertainment to participate and unravel the direction we as humans powered by tech are headed with web 3.0. With discussions on monetization, metaverse, subscription fatigue, OTT platforms, and many more interesting topics, the virtual event was a hit around the globe. 

Watch the enticing session recording here

Affine combines the hyper-convergence of AI, data engineering & cloud with deep industry knowledge in manufacturing, gaming, CPG, and technology. Affine demonstrates thought leadership in all relevant knowledge vectors by investing in research through its highly acknowledged centers of excellence and strong academic relationships with reputable institutions.  

Reach out to us to put a robust and sustainable monetization strategy in place for your business!

Follow us on Medium & LinkedIn to get regular updates!

Cloud Analytics to Improve the Clout of Indie Games

Indie games were once for a niche crowd that enjoyed retro-styled game design and mission progression. Back then, indie games were made by low-key, passionate developers who built quality-of-life mods for popular games, or by the occasional braveheart who would try to cook up a game with archaic code and a sub-par computer. Even if someone managed to make an indie game with minimal resources, there would be no buzz around the game, since no major studio entertained such games.

Today, the indie game scene is thriving and mainstream, with a sizeable audience. So much has evolved that the term "indie games" has been redefined.

Games like Hollow Knight, Terraria, and Among Us weren't just chartbusters; they made the indie space a mainstream one with a solid audience and a substantial foothold in the market. What further echoes the success of indie games is the fact that their development requires a significantly lower budget compared to mainstream AAA, RPG, and sports games. Their pricing is competitive, aimed at garnering more takers and relying on mass sales volumes. Raking in high revenue on a low budget is a sure blueprint for maximizing ROI. But with a limited budget, indie developers face a set of daunting challenges in developing and releasing games over the long run.

The Rise of Indie Games

Many factors have contributed to the rise of Indie games.

The unsung heroes here are the digital download platforms, powered by cloud backends, that make game downloads easier and more economical while helping indie games foray into the average gamer's library.

Unlike large studios, Indie developers fell short of the budget to churn out physical copies in copious amounts, restricting the reach and exposure of their games no matter how good.

With Steam, GOG, EPIC, and other digital storefronts backed by cloud infrastructure and data centers worldwide for local downloads, it is just a matter of uploading their games to the cloud and letting gamers across the world download them.

Digital platforms became mainstream, and indie developers no longer had to worry about the manufacturing expenses of physical game copies eating into their profits.

Indie Marketing – A Challenge That Cloud Can Address

Tons of indie games are churned out regularly, making the segment one of the most competitive. For every successful indie title like Celeste or Minecraft, hundreds of games go unnoticed.

Multiple factors are at play here.

Marketing is one key aspect where indie studios lack the resources to reach a global audience. The volume-based marketing that large studios rely on may be impossible for small studios, but effective data-based marketing is not only possible but imperative.

Efficient targeting is the lifeline of successful marketing. Bombarding social media platforms with ads without a proper marketing plan will not result in user acquisition or brand engagement. You must understand your target base before going all out on digital marketing.

Marketing and user acquisition in this day and age is an omnichannel affair. Video games have a global audience, and studios cannot afford to overlook this factor.

Improving the reach of games requires a well-crafted marketing plan that covers all grounds.

Analytics-based marketing is core to ascertaining the audience base for game studios. The keyword here is data: collecting data at every avenue and online touchpoint is vital to understanding customer behavior and patterns.

It is also the first step toward the logical marvel of data-powered analytics.

Data Collection, Analytics, and ROI (Return on Investment)

With the advent of digital marketing, the term “marketing” is thrown around like confetti, and small, less experienced businesses like indie game studios find it challenging as much is lost in translation.

Indie studios must focus on collecting vital data like game metrics to understand their players’ behavior and data from social media platforms that can help them tune their marketing campaigns for maximum efficiency.

Combining the accrued data with cloud analytics helps studios with ROI-based marketing, giving them a real-time view of what’s going on with their marketing activities instead of a “what went wrong?” meeting.

This way, real-time tweaks to the digital marketing efforts are made possible, and studios can focus on targeting prospective leads on particular social platforms rather than unnecessarily advertising on every platform and user.

Since indie studios have a limited marketing budget, effective ROI-based marketing is the ideal balance to help achieve brand recognition and user acquisition.

Game Analytics – A Must-have Solution for Indie Game Studios

We saw the advantages of cloud-based analytics solutions for marketing. Now, let us look at game analytics that can help indie studios tweak their games and release updates and patches that result in increased player engagement and game lifetime overall.

Game metrics can help studios offer users a dynamic gameplay experience.

Player fatigue, monotonous gameplay, and challenging levels play spoilsport in the long run and affect the gameplay duration of players. DLC and additional content can help here, but it involves resources like budget, development, and workforce.

With game analytics, studios can read real-time game metrics, analyze them, and improve gameplay so that players will put more hours into the game. All this comes without worrying about the additional development cost and human resources for DLC.

There are plug-n-play game analytics solutions that overcome the infrastructural challenges and initial setup costs. Even the maintenance is done on the service provider’s end so that studios can focus on improving the player gameplay experience, the true intent of these solutions rather than logistics.

Indie studios can avail such game analytics solutions on a Netflix-like subscription model from pioneers in the analytics industry, with impressive track records and pedigree working with industry giants.

Cloud Solutions for Indie Game Studios are Not Only Imminent, but Imperative

Even the Silicon Valley giants have acknowledged that the cloud is paving the way for gaming and have introduced multiple cloud-based gaming services.

In 2020, Google introduced a managed service program called Game Servers. Unlike Stadia, Game Servers isn’t a game streaming service but a backend cloud server infrastructure that helps game developers build and scale backend servers for their game titles. So, unlike the common misconception that the cloud in gaming only means streaming, the cloud also plays a vital role in acting as a backend infrastructure for all types of games.

With a considerable stake in gaming and its recent acquisition of Activision Blizzard, Microsoft has revealed its new product, ID@Azure, which lets indie game developers develop their games from scratch for the cloud platform.

Large studios have already made this transition and are reaping its benefits, not only in terms of cost but also in the elimination of managing the backend infrastructure. Why spend time and resources on it, when many service providers like AWS (Amazon Web Services), Microsoft Azure, Google and others can take care of it while developers adapt to the subscription cost model and focus only on making quality games?

Coldfire Games, a German indie mobile game studio, eliminated its backend management effort by adopting an efficient cloud infrastructure. It is only a matter of time before other indie developers follow suit and adopt cloud infrastructure as the backend for their games.

Then we have the cloud gaming phenomenon, which is set to gain significant traction over the next few years, with players like Netflix getting into gaming and giants like Google and Nvidia having services in place. Just as streaming became mainstream, cloud gaming will set a benchmark in mainstream gaming, and developers need to adapt to this by providing games on a subscription model.

From marketing to game analytics and backend infrastructure, cloud solutions are highly beneficial to game studios, particularly indie developers who don't necessarily come equipped with the resources, team, and technical know-how for game marketing and analytics. But it can be challenging to choose the right ally for your game analytics requirements. Affine is a pioneering player in the AI analytics arena, working with large and indie gaming organizations alike to synthesize game analytics into efficient and effective business outcomes.

Get game analytics for your business

The passion for developing exceptional games is all an indie studio needs, for cloud-based solutions are plentiful, with simple implementation techniques and measurable, ROI-based results that will assist them at every step.

Follow us on Medium & LinkedIn to get regular updates!

Cloud’s Role in the Rise of Gaming

Gaming is one of the fastest-evolving industries, with considerable technological advancement. We've come from retro arcade games and LAN parties to playing on portable smart gadgets on the go.

In terms of graphics, we saw the evolution from pixel art to 2D, and then to 3D models. While PC has always been able to do justice to visually exquisite titles, current-generation consoles have changed the dynamics of the graphics race. Ultra-realistic graphics with reflective surfaces and interactive environments give users an immersive gaming experience. And all this before even dipping toes into Virtual Reality, which is yet to hit mainstream status.

The pandemic witnessed a sudden spike in interest in gaming. People flocked to play games, which served as an interactive pastime compared to streaming shows and movies. The aftermath? The gaming industry raked in revenue of $156 billion as of September 2021, and the global video game market is forecast to cross $200 billion by 2023!

Cloud is the Backbone of the ‘Always Online’ Culture

The lack of proper infrastructure in the previous generation limited gaming options to local play, offline games, and LAN parties at best.

Massively Multiplayer Online games were a pipe dream back then. But with the turn of the millennium, things changed. 

Small data centers grew into multiple server farms with global CDNs (Content Delivery Networks) for maximum scalability.

Multiplayer games have gained significant traction over the years and are now a norm. Today we have a plethora of multiplayer online titles, with single-player games providing multiplayer options for players who want to explore beyond the original storyline.

Titles like Rainbow Six Siege made bank for Ubisoft, with lifetime sales of over $1.1 billion and a player base of 70 million as of 2021.

The cloud infrastructure required for a feat like this speaks for itself; it would have been impossible a decade back, simply owing to the lack of such tech infrastructure.

Single-player games also depend significantly on cloud infrastructure, thanks to the paradigm shift from DVDs to digital game stores like Steam, Epic Games, and GOG. Pre-loading games, cloud saves, day-one updates, and DLCs are now standard practice in the gaming industry. The days of waiting in long lines before the next GTA release are a footnote in history. With millions of players downloading at a time across the globe, sustainable cloud infrastructure is at the heart of modern gaming.

Gaming on Demand will be a Reckoning Force in the Future of Gaming

Gaming on demand, or gaming as a service, is growing by the day and will shape the path of gaming just as streaming did for content consumption. Steam Link, Nvidia GameStream, PS4 Remote Play, and many such services offer gaming to the end user over a reasonably fast internet connection. Gaming hardware bundled with free games and discounts still comes with the invisible baggage of limitations, owing to the short yearly tech upgrade cycles in the gaming industry, which nullify the economy factor in gaming.

Developing games for multiple platforms is also an arduous, time-consuming task for game developers, and it results in inconsistent gameplay experiences for the player. We have seen releases like Watch Dogs and Cyberpunk 2077 looking graphically inferior in their release versions compared to the announcement versions. While many factors are at play here, the challenge of developing games for a previous-generation platform alongside the next generation of consoles with high-end hardware causes compatibility issues in the game build and adds to the development cycle. Gamers also have to wait a long time, since release dates keep getting pushed to accommodate fixes to the build.

The high upfront cost of purchase and the scarcity of vital hardware components like storage, RAM, and graphics cards create an opportunist and hobbyist culture in gaming. Thankfully, the advent of gaming on demand will address this issue. While it may take some time for global adoption, the end result not only gives every gamer the opportunity for high-end gaming but also gives game makers access to end users without the hindrance of a hardware barrier.

Cloud Gaming will be the Netflix of Games

While Netflix is testing waters and entering the gaming domain with mobile games, cloud gaming is an underlying phenomenon that will transform the industry and cement its position as the next chapter in gaming.

Stadia may have fallen short of wowing the gaming community, but such are the perils of being an early adopter. Meanwhile, the number of mobile gamers has seen a remarkable spike in recent years, thanks to PvP games like PUBG and Fortnite captivating the masses and converting non-gamers into serious, habitual gamers.

This opens the door to cross-platform play, which is currently a pipe dream with only rare working examples, but all this will change with cloud technology at center stage.

With cloud gaming at the helm, the industry envisions a platform-agnostic gaming ecosystem powered by high-speed internet, primarily relying on robust cloud infrastructure for inclusive, sustainable and affordable gaming.

Currently, there is a global hardware drought caused by the pandemic’s disruption of manufacturing and supply chains, in addition to scalpers grabbing the available stock. This is a testament to how the dependency on hardware for gaming has reached a saturation point.

By eliminating the storage and graphical requirements that are a roadblock for many aspiring gamers, cloud gaming (gaming on demand) brings the biggest USP to the table for gamers, developers, and studios. Gamers don’t have to break the bank on a hardware shopping spree for next-gen graphics, or order terabytes of high-performance solid-state drives for games that cross the 100 GB mark.

Developers don’t have to fret over the game’s performance fidelity across multiple platforms or downgrade the graphics so that players can achieve decent frame rates across different types of devices. There’ll also be a significant reduction in the development period, which could help release games as per the dates advertised!

Studios can increase their target audience from hardcore gamers to even new players since the hardware barrier is no longer an issue. A feasible subscription gaming model will onboard a significant number of new gamers, and studios are looking at a rise in their user base with an imminent increase in ROI and a sustainable business model.

Cloud gaming also emphasizes safety, which is vital in online gaming. Player information leaking onto the internet is nothing new, and even top studios like Bethesda have had this misfortune with Fallout 76. Cloud-based gaming models come with robust online security, which reduces the chances of player databases being breached by external attacks and of information leaking across the web.

Conclusion

Spending on modern hardware every 2-3 years at inflated, astronomical prices in the current economy is not practical for players, not to mention the pile-up of hardware junk, which is not sustainable for a green future. A subscription model, in the long run, is a better alternative.

For businesses, the cost advantage is obvious. Setting up new infrastructure that requires frequent upgrades is a messy and costly affair that entails enormous resource consumption. The big cloud players offer these services for a fraction of the cost with unlimited scalability options and an ‘only pay for what you use’ model. 

Cloud is not all hype. We’ve seen its role in streaming, and the technology has shaped the OTT platforms of today, giving users across the globe access to quality content at the press of a button. The question is how long until it becomes mainstream in the gaming space. Sure, hardware-based gaming will not go extinct, but a shift towards cloud gaming is imminent.

Cloud and gaming go hand in hand. If the present is anything to go by, the dependency on cloud is only going to go up with time. 

Follow us on Medium & LinkedIn to get regular updates! 

Evolution of Human Resources in the New World of Technology

How have Human Resources changed with time?

Of all the departments and functions in a corporate organization, Human Resources is the one function that deals with employees’ personal aspects. The entire employee job cycle is taken care of by Human Resources (HR), ranging from hiring, compensation, leave management, employee satisfaction, development, and growth through to exit. This function requires personal involvement and judgment that may vary from person to person.

Technology, another rather distinct aspect of an organization, is the practical and scientific application of skills, processes, methods, and organizational techniques. It does not involve any personalized features and delivers the same result irrespective of who brings it into the organizational process, or where or when.

But what if these two aspects are related?

The challenges related to HR, like employee engagement, employee retention, leadership development, competitive compensation, the global outreach of businesses, and various other factors, have stimulated extensive innovation in the HR field. For instance, more than 92% of recruiters have turned to social media hiring in the recent decade rather than relying solely on organic hiring methods, and more than 3% of recruiters now use Snapchat as a recruiting channel, moving beyond LinkedIn, Facebook, and Twitter. Below are some instances that bring HR and technology together.

HR and Virtual Reality

COVID-19 has undoubtedly helped change the mindset of stakeholders across the corporate world, especially in India. Holding appraisal meetings, conducting interviews, onboarding, and even workplace celebrations today take place over video calls. Technology and Virtual Reality help HR with talent management, training, onboarding and inductions, hiring, and more, as the new normal.

HR and Machine Learning

Machine Learning (ML) uses algorithms for automated data analysis to create analytical models. HR deals with massive data sets from recruitment and the employee database, and ML helps HR improve the efficiency of initial research, freeing up dedicated hours for higher-value results. So far, machine learning applications in Human Resources are confined mainly to the recruitment process, but it will be exciting to see the advancements in this field.

HR and Cloud Computing

Cloud Computing means using a network of remote servers hosted on the internet instead of a personal computer or a local server. It aids data processing by storing and managing valuable information in the cloud, enabling the HR department to push its expertise towards middle and higher-level leadership, resulting in more efficient business performance and execution. When data on performance, attendance, time tracking, and so on is automated, the focus can shift to increasing productivity, transforming the HR department from a cost center into a revenue generator.

The instances mentioned above cover only part of the broad spectrum of technology interdependencies in the HR department. The ever-changing and fast-paced technological advances are only making HR strive towards innovation that ultimately entwines it even more closely with technology.

Significantly, the global pandemic has altered old stigmas, helping people become more adaptive. Several theories suggest that remote working may continue as the “new normal” once we overcome this pandemic.

This can push the who’s who of the corporate world to refine the entire workplace experience, with HR as the bridge that connects the extremes of an organization, exposing them to, and expanding them with, the latest technological developments.

It will be interesting to see how the Corporates fit into this new reality.

Are Streaming-services like Stadia the future of Gaming?

1. Introduction

Uber has revolutionized the way we commute since its launch. Traveling short distances has never been this hassle-free. Earlier, people used their personal vehicles to cover small distances; the other alternative was public transport, which is time-consuming and inconvenient. Uber, on the other hand, provides flexibility to infrequent travelers and those who commute over shorter distances, as they do not have to spend on purchasing a vehicle and can still move around conveniently. The same might hold true for the future of gaming! How would you feel if technology giants like Google and Amazon owned the expensive hardware, processing games on the best possible CPUs and GPUs and allowing you to simply stream them? This could potentially eliminate the need to purchase an expensive console and let you pay in proportion to your usage. It could be a game changer, especially for someone who has not been able to commit to an INR 30,000 console to play a single game. Can the entry of Google and Amazon into the gaming industry make this possible?

At the Game Developers Conference (GDC) 2019, Google unveiled its cloud streaming service called Stadia. Just as humans have built stadiums for sports over hundreds of years, Google believes it is building a virtual stadium, Stadia, to let thousands of players play or spectate games simultaneously while interacting with each other. Free-to-play games like Fortnite will stand out on Stadia if Google can increase the number of players participating in an instance from 100 to, say, thousands. Whether Stadia will really live up to its hype is a tricky question that only time may answer.

2. How does it work?

Google will make use of its massive data centers across the globe to provide the computational power for this service. These massive servers will use advanced CPUs, GPUs, RAM, and ROM to render games and stream the enhanced audio/visual output to users. The players’ input is uploaded directly to the server via a keyboard or the custom Stadia controller. Let’s look at how Stadia stands against conventional console-based gaming.

3. Comes with advantages over console-based gaming

3.1. No hardware (other than a remote): The bare minimum piece of hardware required is a device that can run Chrome, such as a laptop, PC, mobile, tablet, or even a smart TV.

3.2. No upgrade costs, as they are taken care of by the shared infrastructure hosted by Google. In the recent past, we had games that were below 10 GB in size, while the recent RDR2 was above 100 GB with its patches. One can imagine how the need to upgrade hardware is the biggest driver for upgrading to next-gen consoles.

3.3. No RAM/ROM or loading time limitations: Apart from these, YouTube integration will enable users to broadcast their gameplay live and allow others to join in, in the case of multiplayer games. In addition, the Google Assistant on the Stadia controller will provide immediate help if one gets stuck at some point and needs to clear the stage. The benefits of this concept are really promising. But will the drawbacks offset these promises? Let’s go through each of them.

4. Need to overcome challenges to expand at scale

The drawbacks can potentially be addressed over time, but for now, scaling this service remains the biggest hindrance. There are various challenges that Google (and users) will face, such as latency, pricing, available markets, and the game library. There are other pointers as well, but these are going to be the biggest ones.

4.1. Latency effect

The video footage must get to you, and the controller inputs must get from you to the server, so some extra latency is inevitable. Latency will depend on elements such as:

– The amount of time to encode and decode the video feed: Google has tons of experience in this area from the likes of YouTube.

– The quality of internet infrastructure at the end user’s side: This is the worrisome problem that could hinder the smooth conduct of this process. Internet speeds will be good in tier 1 cities, but not necessarily in rural areas. You will also need a data connection without any cap. As per Google, a minimum speed of 25 Mbps is required to run Stadia. This means 11.25 GB of data will be transferred per hour. That’s about 90 hours of game streaming before the bandwidth is exhausted, assuming the user has a data cap of 1 TB. In other words, 3 hours of gaming per day in a 30-day month, and that is under the assumption that there is only one user and the connection is used only for gaming. (A short sketch of this math follows below.)
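For readers who want to reproduce those numbers, here is a minimal back-of-the-envelope sketch in Python. The 25 Mbps requirement and the 1 TB cap are the figures quoted above; the decimal unit conversion (1 GB = 1,000 MB) and the 30-day month are our assumptions.

```python
# Rough data-usage estimate for game streaming at the quoted bitrate.
# Assumes decimal units (1 GB = 1,000 MB) and a constant 25 Mbps stream.

STREAM_MBPS = 25       # minimum speed quoted by Google, in megabits per second
DATA_CAP_GB = 1000     # a 1 TB monthly data cap, expressed in gigabytes
DAYS_PER_MONTH = 30

gb_per_hour = STREAM_MBPS / 8 * 3600 / 1000    # Mbit/s -> MB/s -> MB/hour -> GB/hour
hours_per_month = DATA_CAP_GB / gb_per_hour    # streaming hours before the cap is hit
hours_per_day = hours_per_month / DAYS_PER_MONTH

print(f"{gb_per_hour:.2f} GB per hour")          # ~11.25 GB/hour
print(f"{hours_per_month:.0f} hours per month")  # ~89 hours
print(f"{hours_per_day:.1f} hours per day")      # ~3.0 hours/day
```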

4.2. Dilemma for developers

That was the issue the end user will face. Let’s now look at the situation from the game developer’s perspective. With the advent of a new platform, developers will have yet another platform to port and test games on. They will have to do more research, which will increase the cost of production, and at the same time, more time will be required to release a game. This will be a big challenge for franchises that launch games every year. Google has partnered with Ubisoft and has promised to feature Ubisoft games at launch. Only time will tell how many more developers will be willing to go a step further to support this concept. If not, this could mean that a lot of games will never be available on the platform. From the consumer’s perspective, it will then be hard to justify the purchase, as they won’t be able to play all the games available in the market.

4.3. Optimal pricing

Another challenge will be pricing. There is no information yet regarding the pricing of the overall model. Is this going to be a subscription service? Do we have to buy games? How will the revenue be shared with developers? Will the pricing be the same for hardcore gamers and casual gamers? Consider Activision (developer of games like Call of Duty), for example. Historical analysis tells us that slightly more than one-fourth of purchasers do not play the game for more than a few hours, while other purchasers play it day in and day out. Each user pays $60 for the game, an amount shared between Activision and the platform on which it is sold. If Activision decides to release the game on Stadia, all the casual purchasers who would have bought the game to test out the hype could instead stream it on Stadia at a much lower cost. Will Activision take that chance and release the game on Stadia? And if the pricing differs by type of user, how will the revenue be shared with the developers? Let’s assume this will be a subscription model and users will be charged $30 per month, which comes out to $360 per year. For a casual gamer, this will be very high, as they can buy a console for $300 and play it for years (see the rough comparison sketched below). All these questions will have to be answered before the launch. Running a cloud gaming service is expensive, and if the whole selling point is making gaming accessible to more and more people, then a high price point is not going to help the cause.
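To illustrate why a flat subscription can look expensive to a casual gamer, here is a hedged back-of-the-envelope sketch in Python. The $60 game price, the $300 console price, and the assumed $30/month fee come from the paragraph above; the playtime profile (roughly two games a year over three years) is purely an illustrative assumption.

```python
# Back-of-the-envelope spend comparison. Only the $60 game, $300 console and
# assumed $30/month fee come from the article; the playtime profile is illustrative.

GAME_PRICE = 60        # USD for a full-priced title
CONSOLE_PRICE = 300    # USD, one-time hardware cost
SUBSCRIPTION = 30      # USD per month (assumed, not announced)

def console_spend(games_per_year: float, years: int) -> float:
    """Total spend for a console owner buying full-priced games."""
    return CONSOLE_PRICE + GAME_PRICE * games_per_year * years

def streaming_spend(years: int) -> float:
    """Total spend on a flat monthly subscription."""
    return SUBSCRIPTION * 12 * years

# A casual gamer buying ~2 games a year, compared over 3 years.
print(console_spend(games_per_year=2, years=3))   # 300 + 60*2*3 = 660
print(streaming_spend(years=3))                   # 30*12*3 = 1080
```

Under these assumptions the subscription ends up costing noticeably more for a light player, which is exactly the pricing dilemma described above; a heavier player, or a cheaper tier, would change the picture.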

4.4. Available markets

At the GDC event, the team said that the service will be available in the US, Canada, the UK, and Europe at launch. These regions have a high penetration of console-based gamers, and Google will have to put in a lot of effort to make these people switch. Meanwhile, the penetration of PlayStation and Microsoft Xbox is in single digits in India and China. With Stadia not available in Asia, Google is missing out on large developing markets like India and China, where people are not inclined towards consoles, and this hampers its user coverage. Given the high cost of consoles in developing countries like India, Stadia could become the go-to gaming platform there.

4.5. Array of games available

The game library will be another hurdle in the race. We have no information regarding the list of games available at launch. Third-party support isn’t enough for a gaming platform to survive; you need a list of exclusive games to bring people aboard. Google even unveiled its own Stadia Games and Entertainment studio to create Stadia-exclusive titles, but it didn’t share any details on what games it will be building. In addition, it is highly unlikely that console exclusives (first-party titles) like Spider-Man or Halo will be available on Stadia. First-party games play a significant role in console sales, and Sony and Microsoft will never let this happen as long as they stick to console-based gaming. So, Google will have to come up with its own exclusive titles to be dominant in the market. Making exclusive games takes a lot of research and time; it took Sony a good 5-6 years to develop one of its best-selling games, “God of War”. If Google has not already started on its exclusive games, it will be a mountain to climb.

4.6. What about other browsers?

Stadia will initially be available only through Chrome, Chromecast, and Android devices. There was no mention of iOS support through a dedicated app or Apple’s Safari mobile browser. Will Apple be comfortable letting its user base shift completely from Safari to Chrome? Will Apple charge Google additional money for the subscriptions Google sells on Apple’s devices? All these questions will be answered over time.

4.7. What if…?

Last but not least, if Google decides to drop Stadia a few years after launch, as it has done in the past with products like Google+ and Google Glass, gamers will lose all their progress and games despite their subscription fees. Apart from the above drawbacks, Google is not the only company to step into this field; it already has some serious competition from existing players in the game streaming sector.

5. Any competition that Google might face?

Sony already streams games to its consoles and PCs via its PlayStation Now service. Microsoft is also planning its own cloud game streaming service and can leverage its Azure data centers. Moreover, neither Sony nor Microsoft requires developers to port their games for their cloud streaming services. Apart from these two players, Nvidia has been quite successful in this domain, allowing users to stream games from its library. This means Google has some strong competition, and it looks like the cloud gaming war is just getting started.

6. Conclusion

What is the incremental change you get from one version of a device to another? It is the absolute bare minimum the maker can offer to make people switch. Take the example of the PS4 Slim and PS4 Pro: the main difference is that the Pro supports 4K while the Slim doesn’t, and yet we have seen 30% of people switch from the Slim to the Pro. The entrance of Google into the gaming industry will make PlayStation better, it will make Xbox better, and it will make internet infrastructure better. The success or failure of Google Stadia will cost the consumer nothing, and at the same time, it will be a net positive for the gaming industry.

Thanks for reading this blog. For any feedback, suggestions, or comments,
please drop a mail to marketing@affine.ai

Contributors:
Shailesh Singh – Delivery Manager
Akash Mishra – Senior Business Analyst

