By Gary Bhattacharjee, AI Practice Lead, Infosys.
The power of personal interaction in conducting business transactions was often taken for granted, until the COVID-19 pandemic of 2020 shut down all physical touchpoints in global businesses across all verticals, forcing leaders to rethink business norms. Let’s explore the factors that make human interactions complex in a business-to-business (B2B) context, and how techniques from Artificial Intelligence (AI) can alleviate them.
In a remote work environment, ‘teaming’ becomes difficult without physical interactions. While plenty of virtual meeting technologies existed before the pandemic, the lockdown made them the only method of collaboration, thereby opening the door for innovations that scale and improve existing capabilities.
Using neural-network-based voice-to-text techniques, virtual meetings can generate minutes automatically and produce a full transcript for analysis, traceability, and reference.
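To make the idea concrete, here is a minimal, hypothetical sketch: assuming a voice-to-text step has already produced a transcript, a simple frequency-based extractive summarizer can pick candidate lines for the minutes. A production system would use neural summarization models; the `summarize` function and sample transcript below are illustrative only.

```python
from collections import Counter
import re

def summarize(transcript: str, top_k: int = 2) -> list[str]:
    """Pick the top_k most information-dense sentences as draft minutes."""
    sentences = [s.strip() for s in re.split(r"[.!?]", transcript) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", transcript.lower()))

    def score(sentence: str) -> float:
        # Average word frequency: recurring topics rank higher.
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / len(tokens) if tokens else 0.0

    return sorted(sentences, key=score, reverse=True)[:top_k]

transcript = (
    "We reviewed the contract renewal. The contract renewal needs legal sign-off. "
    "Lunch options were discussed briefly. Legal sign-off is due Friday."
)
print(summarize(transcript))
```

The point of the sketch is the pipeline shape: speech-to-text produces text, and a downstream model distills it into minutes that can be traced back to the transcript.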
Using transfer-learning techniques, conversations in a virtual meeting can be auto-translated for multilingual participants.
In face-to-face meetings, we subconsciously read body language and shape our responses; that is much harder on a video call. However, gesture-analysis and micro-facial-expression techniques can identify broad sentiments, in some cases more consistently than human perception over video.
Although the paperwork needed for business-to-consumer (B2C) transactions has shrunk drastically in the digital world, B2B interactions still rely heavily on interpreting information from narrative documents, printed forms, and handwritten paper. The primary reason is that the business rules for these transactions are often so unique and complex that they have not warranted computerization of the process.
Natural Language Understanding (NLU)
Advances in ontologies and in neural-network-based algorithms for synthesizing context are widely used to understand business documents such as contracts. An NLU-enabled platform can replace the human handling of paper, thereby enabling contactless business operations.
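As a toy illustration of turning a narrative document into structured fields, the rule-based sketch below pulls dates, amounts, and party names out of a contract snippet. The patterns and field names are hypothetical; a real NLU platform would use ontology lookups and neural models rather than regular expressions.

```python
import re

CLAUSE_PATTERNS = {
    # Hypothetical patterns for illustration only.
    "amount": re.compile(r"\$[\d,]+(?:\.\d{2})?"),
    "date": re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),
    "party": re.compile(r"\b(?:Supplier|Buyer)\b"),
}

def extract_fields(contract_text: str) -> dict[str, list[str]]:
    """Pull structured fields out of a narrative contract document."""
    return {name: pat.findall(contract_text) for name, pat in CLAUSE_PATTERNS.items()}

contract = ("The Supplier shall deliver goods to the Buyer by 2020-09-30 "
            "for a total fee of $12,500.00.")
print(extract_fields(contract))
```

Once the fields are structured, downstream systems can route, validate, or approve the transaction without a person ever touching the paper.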
A great deal is still captured in handwritten notes: legal annotations on contracts, clinical notes and prescriptions, meeting minutes, and so on. Advances in image processing, pattern recognition, and style classification have improved the accuracy of handwritten-text recognition. This reduces the need for human handling and builds trust in handwritten communications.
What we take for granted in the real world is the authenticity of another human. A virtual world, however, requires us to validate and authenticate the identity of the entities we interact with.
Human users can be recognized from the electronic breadcrumbs they leave in the digital space. A bot may mimic a human, but with behavioral analytics an AI-powered system can identify a non-human actor. In addition, by gathering profile information from third-party data sources, the system can not only authenticate a person but also validate credentials, obviating the need for manual due diligence.
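One of the weakest signals a bot leaves behind is timing regularity: humans produce irregular gaps between clicks and keystrokes, while simple automation fires at near-constant intervals. The sketch below is illustrative only (the threshold and sample data are hypothetical, and real behavioral analytics combine many such features):

```python
from statistics import pstdev

def looks_automated(event_times: list[float], jitter_threshold: float = 0.05) -> bool:
    """Flag actors whose inter-event timing is suspiciously regular.

    The jitter threshold is illustrative; production systems learn it
    from observed human behavior.
    """
    gaps = [b - a for a, b in zip(event_times, event_times[1:])]
    return pstdev(gaps) < jitter_threshold

bot_clicks = [0.0, 1.0, 2.0, 3.0, 4.0]      # metronomic intervals
human_clicks = [0.0, 0.8, 2.5, 2.9, 4.7]    # bursty and uneven
print(looks_automated(bot_clicks), looks_automated(human_clicks))
```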
Third party validation
Notary services and escrows were created so that, in a business transaction, a third entity validates an identity or transaction. With advances in digital signatures, biometric authentication, and other AI-powered techniques, notarization can be made completely virtual, eliminating the need for human notaries.
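The core round trip of any digital attestation is sign-then-verify. Real e-notarization uses asymmetric signatures (e.g. RSA or ECDSA), so verifiers never hold the signing key; the sketch below substitutes a keyed hash (HMAC) purely because it is available in the standard library, and the key and document are hypothetical.

```python
import hmac
import hashlib

def sign(document: bytes, key: bytes) -> str:
    """Produce a tag that binds the signer's key to this exact document."""
    return hmac.new(key, document, hashlib.sha256).hexdigest()

def verify(document: bytes, key: bytes, signature: str) -> bool:
    """Constant-time check that the document is unchanged since signing."""
    return hmac.compare_digest(sign(document, key), signature)

key = b"notary-signing-secret"   # hypothetical; would be an asymmetric key pair
doc = b"Deed of sale: Lot 14, executed 2020-06-01"
sig = sign(doc, key)
print(verify(doc, key, sig))                  # untampered document
print(verify(doc + b" (amended)", key, sig))  # any alteration is detected
```

The property that matters for virtual notarization is the second call: any post-signing alteration, however small, invalidates the signature.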
Blockchain technology establishes the authenticity of a transaction, a contract, or an asset in a business. The unique identifier for each block is a cryptographic hash: immutable, yet fully derivable from the business context it records.
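A minimal hash-chain sketch shows how a block identifier is derived from the business payload and the previous block, making the chain tamper-evident. The payload fields are hypothetical; real blockchains add consensus, timestamps, and Merkle trees on top of this idea.

```python
import hashlib
import json

def block_id(business_payload: dict, previous_id: str) -> str:
    """Derive an immutable block identifier from the business context.

    Any change to the payload or to the chain's history yields a
    different hash, which is what makes tampering detectable.
    """
    record = json.dumps({"prev": previous_id, "data": business_payload},
                        sort_keys=True).encode()
    return hashlib.sha256(record).hexdigest()

genesis = block_id({"event": "contract signed", "value": 100}, previous_id="0" * 64)
child = block_id({"event": "payment released", "value": 100}, previous_id=genesis)
tampered = block_id({"event": "contract signed", "value": 999}, previous_id="0" * 64)
print(genesis != tampered)   # altering the business context changes the id
```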
In a virtual space, some security properties that are taken for granted in a physical space need to be actively discovered and managed, as the threats are often unseen.
In a virtual workplace, users are typically admitted through a multi-factor-authenticated ‘gate’, and all communications are protected by encryption. However, these protection methods are still subject to attack. Advances in AI help with near-instantaneous threat detection and isolation.
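One concrete form of instantaneous detection is spotting a credential-stuffing spike as it happens. The sliding-window sketch below is a hypothetical simplification (the class name, window, and threshold are all illustrative; real systems learn baselines per user and per network segment):

```python
from collections import deque

class LoginMonitor:
    """Flag a possible credential attack when failed logins spike."""

    def __init__(self, window_seconds: float = 60.0, max_failures: int = 5):
        self.window = window_seconds
        self.max_failures = max_failures
        self.failures: deque[float] = deque()

    def record_failure(self, timestamp: float) -> bool:
        """Return True if this failure should trigger isolation."""
        self.failures.append(timestamp)
        # Drop failures that have aged out of the sliding window.
        while self.failures and timestamp - self.failures[0] > self.window:
            self.failures.popleft()
        return len(self.failures) > self.max_failures

monitor = LoginMonitor()
alerts = [monitor.record_failure(t) for t in [0, 5, 10, 12, 15, 18, 20]]
print(alerts)
```

Because the decision is made inline with each event, the account or session can be isolated the moment the threshold is crossed, rather than after a nightly log review.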
A more insidious threat is an authenticated user engaging in malpractice, since their actions are not observed by peers as they would be in a normal workplace. For example, a user can easily steal sensitive data from a remote computer screen by taking a phone picture or simply jotting it down on paper. AI-based behavior analysis can identify and infer such conduct, even without capturing a direct signal of the act itself.
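Inferring misconduct without a direct signal usually means watching proxy metrics for deviation from a user's own baseline. The sketch below uses a simple z-score test; the metric, history, and cutoff are hypothetical stand-ins for whatever telemetry a real system would model.

```python
from statistics import mean, stdev

def is_anomalous(baseline: list[float], observed: float, z_cutoff: float = 3.0) -> bool:
    """Compare today's activity metric against the user's own history.

    The metric could be, say, minutes of screen-idle time while sensitive
    records are open -- a proxy, since photographing a screen is never
    directly captured. All values here are hypothetical.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(observed - mu) > z_cutoff * sigma

history = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]   # per-day baseline for one user
print(is_anomalous(history, 4.3))           # within normal variation
print(is_anomalous(history, 19.5))          # large deviation: flag for review
```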
People working remotely miss workplace conversations and interactions. In addition, the attention span of home workers is often frayed by personal distractions. The AI disciplines of vision analytics and video intelligence enable the following techniques:
Augmented Reality (AR)
AR overlays virtual objects from another physical space onto the real objects in a home office, giving it the feel of an interactive workspace. Using vision-intelligence techniques, such altered reality can mimic an extended workspace that keeps the worker engaged.
Virtual Reality (VR)
In VR, the worker is completely immersed, through sight, sound, and now touch, in the projected reality. In certain workplaces, such as a shop floor, an advanced VR system renders co-workers as ‘digital twins’: complex compositions of data and behaviors detected in real time from their analog counterparts using the disciplines of AI.
While these AI-powered capabilities, mimicking and amplifying human interactions, can meet the ‘real-world’ engagement needs of a socially distant workforce, one thing remains beyond the reach of current AI: intuition. Is this sixth sense that a human feels unquantifiable, or simply not yet mathematically expressed? The perception of a successful outcome still tends to favor intuition-based decision making, owing to our deep-seated reliance on gut feel; over time, perhaps, we will overcome that and learn to trust machine-based controls. Intuition could be an ephemeral, fictional construct of the human mind, or it could be real, and thus something that can be computed, mimicked, and amplified. As AI continues to evolve, we may eventually build intuitive machines that we trust implicitly.