Showing posts from 2018

Free Tier Serverless Cloud Platforms

Last week I was trying to write an Amazon Alexa skill (earlier I had created a Google Assistant app using Dialogflow) and found that, as a beginner, creating an Alexa app was not as intuitive as Dialogflow. Probably it's just the UI/UX. There is much more incentive to create Google Assistant apps, as they can be accessed on Android phones right from the lock screen, as opposed to Alexa skills, for which you need to use an Echo device or keep the Alexa app running in the background on your phone.

Nevertheless, I needed a service endpoint that would feed data to my Alexa app. The recommended way is an AWS Lambda function; however, you can also use any public endpoint URL of your own. I recently did a lot of work on Google Cloud Functions, so I opted for that, though the network latency may be marginally higher because an Amazon data centre now needs to talk to a Google data centre. Still, the difference was small compared to my home internet connection's latency to the cloud.
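To give an idea of what such an endpoint returns, here is a minimal sketch of the JSON payload an Alexa skill expects back from its service. The field names follow the Alexa Skills Kit response format; the function name and speech text are made up for illustration.

```python
import json

def build_alexa_response(speech_text, end_session=True):
    """Build a minimal Alexa-compatible response payload (illustrative)."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

# A Google Cloud Function entry point would parse the incoming request
# JSON, branch on the request type (LaunchRequest, IntentRequest, ...)
# and return this payload as the HTTP response body:
payload = build_alexa_response("Hello from a Cloud Function!")
print(json.dumps(payload))
```

As long as the endpoint is public, speaks HTTPS and returns this shape, Alexa does not care whether it is hosted on Lambda or elsewhere.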


GDG Devfest 2018 Hyderabad

It was the 7th year in a row that the Google Developers Group hosted its flagship event, DevFest, in Hyderabad. This time it was big: not only was it a 2-day event with dedicated days for sessions and codelabs/workshops, it also accommodated more than 600 developers, including professionals and students.

Lately, I have been working with Google Cloud Functions at work, as it went GA during GCP Next '18 earlier this year. I decided to submit a proposal around the same for this event. Slides and Code Lab instructions. 

While the proposal was accepted by the committee, it was moved to a code lab instead of a session, as it was more of a hands-on exercise and not too advanced for someone who had not used Cloud Functions before. While preparing the steps I would ask participants to follow during the workshop, I realized there was a big problem in terms of GCP projects. Cloud Functions requires a GCP project that is billing enabled, although it is free up to a few million…

PyCon India 2018 Day 2

This post is a continuation of the post I wrote after the end of the first day. The 2nd day started with the keynote from Travis Oliphant, who is well known for Python projects like NumPy, SciPy, Anaconda and many more popular libraries. It was one of the best keynotes I have heard in recent times. It covered not only the technical aspects of Python language development but also a lot of important challenges developers face as human beings, from work-life balance to disagreements on mailing lists over decisions about the Python language. Drawing on personal examples, he gave a lot of insight and hope that literally anyone can contribute to open source projects. What stood out in his keynote was that, unlike many other speakers, he gave real-life examples for every abstract point he mentioned. His keynote motivated me a lot, especially to write more, read more and communicate more as a developer.

The session by Anand S on cleaning up data was one of the bes…

PyCon India 2018 Day 1

It has been a while since I went to a tech conference. I had been to JavaOne and Google DevFest (even when I was not speaking) and also reported on some of them. The report posts were mostly notes on the things I had learned, so that I could refer back to them at a later point in time.

Last year PyCon India was in New Delhi, and some of my colleagues who worked on data science and Python microservices had been there. I got mixed feedback from them. Some said the sessions and workshops were very basic, i.e. beginner level, while others said the topics were too specific and deep to grasp in real time during a conference.

This year was the 10th edition of the conference, and it was held in my own city, Hyderabad. I have been working on Python-based projects, and running them in production, for quite some time now, so there was no excuse to miss it.

I did not really get a chance to attend the workshops, but I did get a chance to represent the Google Developer Community on the conference day.

Day 1

The day started with a…

Ikea Hyderabad

Ikea opened its operations in Hyderabad last month, and there has been a craze about it among people in the city since then. A lot of videos were circulated showing an uncontrollable crowd pushing at the entrance of the store when it first opened. I am sure a lot of people just wanted to go once to see what it is all about, and would later go only when they wanted to buy something.

It has been a month, and my friends told me the crowd has reduced a lot, so I decided to go. Their home page showed there was no waiting time at the entrance. There is a lot of parking at the store itself, which is not really advertised, and on a Sunday morning it was really empty.

Ikea in Hyderabad is also quite famous for the affordable food in its restaurant, so we gave it a shot. I was really surprised to see it was truly self-service, i.e. you pick up food items and pay only at the end, just before finding a table. I had not seen this in India.

Then we visited the furniture section which is…

Spark, Dataframes, PostgreSQL

Spark is one of the most successful projects at Apache; it is one of the most popular skills for big data engineers, and a lot of companies look for this specific skill while hiring.

Spark is distributed computing software that can employ multiple machines (a cluster), which means you can scale horizontally (scale out) by adding more and more computers, instead of having to buy/rent computers with higher CPU and memory (scaling vertically/scaling up).

Setting up (Standalone Mode)
Install: brew install apache-spark
Run master: /usr/local/Cellar/apache-spark/2.3.1/bin/spark-class org.apache.spark.deploy.master.Master
Run slave(s): /usr/local/Cellar/apache-spark/2.3.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://:7077 -c 1 -m 512M
You will get the master URL in the console output after running the master, and you can run slaves either in another terminal or on another computer connected to the same network.
Run example on master: /usr/local/Cellar/apache-spark…

Microsoft Azure - first impressions

So, after GCP and AWS, it was time for me to get my hands dirty with Azure. When I initially heard Microsoft was jumping onto the bandwagon of public cloud vendors, it seemed like a wannabe. GCP itself had a lot of catching up to do with AWS. IBM and Oracle have been trying for a long time, but they have settled for a different genre of customers; e.g. Oracle is primarily focusing on moving its own legacy software like E-Business Suite, and modern on-premise software like Oracle Fusion, onto its Oracle Public Cloud, instead of trying to win market share from AWS or GCP. IBM has managed to get some real customers like EA, but is not quite there yet.
On the other hand, I think Microsoft Azure has come from behind and not only posed some serious challenges to GCP and AWS, but is also capturing a major market share with its unique offerings in machine learning services and fantastic partnerships with, and support for, corporations.

With whatever I have explored so far, I have mixed opinions about it. …


I had to travel to Atlanta last week for one of the meetings at work. Finally, I have made it to all 4 time zones of the USA: Austin, Salt Lake City, Seattle and now Atlanta. My office was in Roswell, which is outside Atlanta and quite a lonely place at times. Maybe because I was staying quite far from the happening areas, I did not see many people walking on the streets (sidewalks, to be precise :) ).

Harvesting Your Old Smartphone Sensors

If you check the list of sensors that even a basic smartphone has in its manual, you will be surprised to see how long it is. An ambient light sensor, gyroscope, magnetometer, GPS, sound noise level, gravity etc. are very common even in the cheapest smartphones. An old smartphone that you may no longer be using because of some other issue can still have all its sensors working properly. You can stream the sensor data to another computer or the cloud, where you can process it and possibly perform some actions.

I used an app called Sensor Node, in which you can specify the IoT server address and the topic name to which the phone's sensor data should be streamed.
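On the receiving side, each reading arrives on a topic as a small message. Here is a hypothetical sketch of how such a message could be shaped for an MQTT-style broker; the topic prefix and field names are made up (the real Sensor Node app defines its own payload format), and the commented lines show where a real MQTT client would take over.

```python
import json
import time

def make_sensor_message(sensor, values, topic_prefix="phone"):
    """Package one sensor reading as (topic, JSON payload). Illustrative only."""
    topic = "{}/{}".format(topic_prefix, sensor)
    payload = json.dumps({
        "sensor": sensor,
        "values": values,           # e.g. x/y/z axes for a gyroscope
        "ts": int(time.time()),     # epoch seconds, for plotting over time
    })
    return topic, payload

topic, payload = make_sensor_message("gyroscope", [0.01, -0.02, 9.81])
# With the paho-mqtt client installed, publishing would look like:
# client = mqtt.Client(); client.connect("broker.example.com")
# client.publish(topic, payload)
print(topic)
```

A subscriber on the same broker would then subscribe to phone/# and feed each JSON payload into whatever processing or visualisation you like.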

Finally, the following web utility (thanks to ) can be used to visualise the data in real time.

Shooting the Super Blue Blood Moon

There was a very rare astronomical event this 31st Jan. The moon was at its closest position to Earth on a full moon day, hence a Super Moon. It was also the second full moon of the month, so it was a Blue Moon too. Remember the saying "once in a blue moon"? It means a rare event, and obviously having two full moons in a month is rare. When it happens, they occur near the very first and last days of the month, as there is always a gap of about a month between them; so it was the 1st and 31st of Jan. The Blood Moon meant the moon was going to turn blood red for some time. This happens during a total lunar eclipse, when the moon does not get any direct sunlight (due to Earth's shadow falling on it) but receives light refracted through Earth's atmosphere (mostly from the limbs), which turns the moon red.

In the past, I had seen a lot of eclipse montages where people stitch various stages of an eclipsed Sun or Moon into a single photo. I tried the same this time as well. Used an 18-55 Can…