There is a lot of excitement in the Decentralized Finance (DeFi) space. Can you comment on what you see as the particular opportunities there, and what aspects you see as hype?
Much of the excitement is around wholesale disintermediation and the reduction of transaction costs, so opportunities abound for entrepreneurs and established corporations alike to nurture payment, investment and financing products. DeFi protocols inherently allow the creation of platforms for specific use cases and their variations; that is to say, there is a risk of creating overlapping, incompatible, erroneous or extraneous layers. The fact that the space is unregulated and transactions are irreversible means that widespread adoption depends on trustworthy protocol providers and decentralized app builders. In some ways, this runs counter to the promise of disintermediation and threatens the entire field. From a larger perspective, this is setting the stage for more regulated or government-backed cryptocurrency platforms. I believe the smart money would be on creating agnostic layers mirroring current DeFi protocols to facilitate mainstream use cases for a much larger audience.
We’ve had about a decade and a half of “Data” talk, from Big Data to AI+Data and so on, and a lot of clichés bandied about: data is the new oil, the new gold, and so on. You work on all aspects of data. Can you explain what the infrastructure and cultural requirements are for a sustainable data strategy?
I would argue that the data talk track has not necessarily changed how data is collected, organized or put to use. More data has not necessarily translated into more novel approaches to utility, productivity or efficiency. Today’s data technologies natively allow much faster, more reliable and more accessible processing of far more voluminous data, allowing core business applications to continue serving business operations. The technology’s potential has been underutilized, however, when it comes to making sense of data and putting it to good use. Yes, many businesses have been able to automate certain actions and decisions based on the vast amounts of data available. Yet most businesses still have not been able to make data central to their decision-making apparatus. There are, in my opinion, two distinct causes, and thus two possible strategies. One, data is incomplete and fragmented, residing in multiple systems that are rarely collated. This makes it very difficult to build a comprehensive perspective and leaves businesses to fall back on individuals’ past experiences, blocking interdisciplinary learning and innovation. Possible remedies include modernizing business systems, building common data layers and culturally forcing a larger, multi-faceted discussion. Two, most business professionals lack data-interpretation skills, which has led to a rather constrained view of data, limited to the purposes of visibility and control. Informed application of statistical tools, Bayesian decision making, critical thinking and data-based evidence would unlock the potential of available data and generate a healthier demand for broader and more meaningful data, maturing the overall data infrastructure. Organizations would do well to emphasize critical thinking, evidence-based decision making, knowledge of statistics and objective reasoning as part of their hiring and skilling portfolios.
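To make the reference to Bayesian decision making above concrete, here is a minimal illustrative sketch (not from the interviewee): a Beta-Binomial update, one of the simplest Bayesian tools a business analyst might use to revise a belief, such as a campaign conversion rate, as evidence arrives. All numbers and names below are hypothetical.

```python
# Hypothetical example: updating a belief about a conversion rate
# with observed outcomes, using a Beta-Binomial model.

def beta_update(prior_a, prior_b, successes, failures):
    """Update a Beta(prior_a, prior_b) belief with observed outcomes."""
    return prior_a + successes, prior_b + failures

def beta_mean(a, b):
    """Expected success rate under a Beta(a, b) belief."""
    return a / (a + b)

# Start with a weak prior belief of roughly a 10% conversion rate.
a, b = 1, 9

# Observe a hypothetical campaign: 30 conversions out of 200 contacts.
a, b = beta_update(a, b, successes=30, failures=170)

print(f"Posterior mean conversion rate: {beta_mean(a, b):.3f}")
# -> Posterior mean conversion rate: 0.148
```

The point is not the arithmetic but the habit it instills: decisions are revised incrementally as data accumulates, rather than resting on a single individual's past experience.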
A few areas have received a lot of fanfare of late: Finance/Finserv, Healthcare, and Payments, all of which are highly regulated. Can you walk us through what it means to be a consultancy and provider to large companies in the space, and how your solutions account for the horsemen: privacy, security, compliance and governance?
I would differentiate between two aspects of regulatory impact: structural and operational. Structural regulations, such as licenses and institutional frameworks, have led to a certain rigidity, but as they change, a slew of options is opening up, not just for consumers but for businesses as well, some of which are in a regulatory gray area. As we consult on technology implementations, we are careful about building dependencies on platforms or services that might be affected as regulations evolve; these may relate to localization, data residency, governmental access, mandatory record keeping, tax-jurisdiction applicability and so on. Operational regulations, such as those requiring assurance of privacy, safety or compliance with procedural norms, require much deeper consideration as systems are designed, built, implemented and run. We aim for ground-up compliance, which means that all platforms, all components and all processes relating to design, development and release are in accordance with the relevant regulations and standards. We have internal and external mechanisms for assurance and a high degree of awareness and sensitivity across our teams.
The world is abuzz with talk of AI and ML. How do you think about these categories in a practical fashion? How do you help apply them to the real-life, real-time struggles of your customers?
AI and ML are ubiquitous now. They are fast becoming so intrinsic that I would consider most of the chatter around them inconsequential. Practically speaking, AI/ML find application in three ways. The first is enabling use cases that were hitherto unviable or clunky: think of image recognition, NLP, recommendation engines and so on. Most of these are now available as components that can be woven into a comprehensive user scenario. Thus we offer chatbots, customer-experience personalization, process automation and the like, which are much more powerful when we integrate AI/ML-based components within them. The second is where AI/ML enhances the effectiveness of how technology is already used; examples include predictive analytics, pattern recognition, log analysis and threat detection, performance enhancement (load time, image processing, rendering, etc.) and operational monitoring. Our endeavour has been to ensure that any implementation benefits from AI/ML, using the state of the art across the platforms and services stack. Lastly, AI/ML are very powerful tools in a data scientist’s arsenal: ad-hoc modeling, data analysis, category identification and insight building are useful not just in specific scenarios but also as inputs to a larger business process or information-systems design.