
Interview: Sanjiv Kumar Jha, Principal Solution Architect – Smart Infrastructure, Amazon Internet Services Private Limited, AWS India and South Asia

“Unavailability of the right software and skilled cloud architects is hindering adoption of Cloud Computing in the Geospatial segment.” – Sanjiv Kumar Jha.

This interview was originally published in AGI’s Jan-Feb 2022 Newsletter Edition on the theme of Land Administration.

 

With the huge volume of geospatial big data and the vast number of stakeholders involved in the digitization of land administration, the “cloud” appears to be the much-needed solution. And yet, geospatial cloud computing is still in its infancy in India. Where is the gap?

There are two major challenges slowing the adoption of cloud computing in the geospatial segment and preventing it from scaling. The first is the unavailability of the right software to process geospatial data on the cloud. The software used to process this data is decades old and mostly desktop-based. Vendors are now building cloud-optimized versions, but this is still at a very early stage. One can deploy the existing software on the cloud, but it has yet to evolve to take advantage of native cloud benefits such as elasticity and dynamic scaling. So, the true value of cloud transformation is yet to be realized.

The second is a general lack of awareness and of skilled cloud architects in the industry. Doing a lift-and-shift of workloads from an on-premises deployment does not achieve the full value of cloud transformation; we need cloud-optimized architecture to fully realize the potential, and skilled cloud talent is important here. The reality is that technology innovation is outpacing skills development. AWS understands this, and we are deeply committed to and invested in developing a skilled talent pool in the country. We are doing this by collaborating with educational institutions to offer integrated computing curricula in undergraduate and postgraduate degree programs at scale across states in India; training faculty and providing students with an industry-recognized cloud curriculum and access to AWS certifications through the AWS Academy program; and offering the AWS re/Start program.

 

Talking about the various cloud deployment models (public, private, hybrid, multi-cloud, community), is there a single solution that offers the advantages of security, accessibility, control, affordability, and compatibility at the same time?

A public cloud has the advantage of economies of scale, and over time it has evolved into the one-stop solution for security, accessibility, control, affordability, and compatibility. AWS has been the world’s most comprehensive and broadly adopted cloud offering for over 15 years and serves millions of customers globally.

At AWS, security is always our top priority. AWS has been architected to be the most flexible and secure cloud computing environment available today. Our core infrastructure is built to satisfy the security requirements of the military, global banks, and other high-sensitivity organizations.

 

In contrast to the past, when data was scarce, today we have a multitude of resources offering geospatial data in various formats, resolutions, and quality standards. Is compatibility/interoperability of these different types of datasets mandatory for moving them to the cloud? If yes, how can organizations ensure this?

Compatibility/interoperability is not a blocker for cloud adoption; there are emerging standards that make storing these large datasets on the cloud efficient. One of these evolving standards is the Cloud Optimized GeoTIFF (COG). A COG is simply a GeoTIFF (a public-domain metadata standard) with an internal organization that supports efficient access via HTTP, i.e., it is formatted to work on the cloud.

This internal organization is combined with an HTTP feature called a ‘GET range request’, which allows only the portion of the file that is needed to be retrieved, as the sketch below illustrates. Similarly, the SpatioTemporal Asset Catalog (STAC) specification provides a common language to describe a range of geospatial information so that it can more easily be indexed and discovered. While moving data to the cloud, it is advantageous to think in terms of cloud-optimized data formats for ease of discovery and sharing.
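For illustration, here is a minimal sketch of the GET range request mechanism that makes COGs cloud-friendly: the client asks for a byte range and the server returns only that slice of the file. The URL is a hypothetical placeholder, not a real dataset endpoint.

```python
# Sketch: read just the header region of a Cloud Optimized GeoTIFF over HTTP.
# Only the requested byte range is transferred, not the whole (possibly
# multi-gigabyte) image. COG_URL is a hypothetical placeholder.
import requests

COG_URL = "https://example-bucket.s3.amazonaws.com/imagery/scene.tif"

# A COG keeps its header and tile index at the start of the file, so a
# client can learn the file layout from the first few kilobytes alone.
resp = requests.get(COG_URL, headers={"Range": "bytes=0-16383"}, timeout=30)

# HTTP 206 (Partial Content) confirms the server honoured the range request.
print(resp.status_code)            # expect 206
print(len(resp.content), "bytes")  # 16384, regardless of total file size
```

Geospatial clients such as GDAL-based readers use exactly this mechanism under the hood to mosaic and window into COGs without downloading full scenes.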

 

The AWS cloud platform itself includes a variety of cloud-based data storage options for companies to choose from. Which of these solutions comes across as the best fit for geospatial imagery processing and analytics, considering the dynamic mosaicking and on-the-fly processing requirements of land imagery?

AWS has many storage options from which one can choose to architect a solution. We recommend a multi-tier storage pattern for geospatial image processing, sketched below. While Amazon Simple Storage Service (Amazon S3) is the ideal choice for storing large-scale data and building a geo-data lake, customers can also use Amazon Elastic File System (Amazon EFS) or Amazon Elastic Block Store (Amazon EBS) for storing, and gaining faster access to, operational and transactional data. This approach gives the most cost-effective storage strategy for large-scale geospatial image processing.
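A minimal sketch of the S3 data-lake tier of this pattern, using boto3; the bucket name, key prefixes, and file names are hypothetical placeholders.

```python
# Sketch: Amazon S3 as the geo-data-lake tier of a multi-tier storage pattern.
# Raw imagery lands in S3; processing jobs discover scenes by prefix and stage
# working copies on EBS/EFS for low-latency access. Names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Land raw imagery in the data lake, organized by source and date.
s3.upload_file(
    Filename="scene_20220115.tif",
    Bucket="example-geo-data-lake",
    Key="raw/landsat/2022/scene_20220115.tif",
)

# Discover scenes for a processing run by listing a prefix.
resp = s3.list_objects_v2(Bucket="example-geo-data-lake", Prefix="raw/landsat/2022/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

The prefix layout is a design choice: partitioning keys by source and date keeps listings cheap and lets downstream jobs pull only the slice of the lake they need.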

 

With a conducive policy environment liberalizing the geospatial sector, we will be witnessing a new era of business development powered by innovative geospatial mobile applications. But challenges around the scalability and infrastructure management of these applications continue to hinder startups and small businesses. Is there an affordable way forward for them?

Startups need to look at moving from a capex model for infrastructure to an opex model. We believe the cloud is the best place to build startups, and that the AWS Cloud is the best possible environment for them to flourish. Cloud services are incredibly cost-effective and can quickly give startups access to computing resources as they need them, on a pay-as-you-go basis. And because the cloud is inherently built for dynamic scaling, scalability isn’t a challenge.

At AWS, we offer a broad range of programs and initiatives to support startups at every stage of their lifecycle, from early stage through to maturity. These programs provide startups with a host of benefits, including AWS credits, technical support, and training. We also offer extensive cost optimization services to our customers. We’ve reduced prices 111 times since AWS launched in 2006. Our team members are tasked with reducing a startup customer’s cloud bill, ensuring startups make the most economically efficient use of our services.

Cloud infrastructure costs can represent a large portion of a startup’s regular expenditure, and our team has been able to achieve savings for customers of up to 40%. We do this by offering a range of pricing models, including Amazon EC2 Spot Instances, which startups particularly appreciate because they can deliver discounts of up to 90% compared with On-Demand pricing; a minimal launch sketch follows below.
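As a sketch of how a startup might tap that spot pricing for an interrupt-tolerant batch imagery job via boto3; the AMI ID, instance type, and region are hypothetical placeholders.

```python
# Sketch: launch an EC2 Spot Instance for a batch geospatial processing job.
# Spot capacity trades possible interruption for steep discounts, so the job
# itself should checkpoint its progress. All identifiers are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # assumed region

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical imagery-processing AMI
    InstanceType="c5.4xlarge",         # compute-optimized for raster work
    MinCount=1,
    MaxCount=1,
    # Requesting spot capacity instead of On-Demand is what earns the discount.
    InstanceMarketOptions={
        "MarketType": "spot",
        "SpotOptions": {"SpotInstanceType": "one-time"},
    },
)
print(resp["Instances"][0]["InstanceId"])
```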