Introduction and Purpose

The Use Case

This article continues the series on building a Python-based real-time reporting tool for IoT data. As in the first article of this series, we use IoT connectivity data. The dataset contains timestamps, events, SIM card IDs, and data usage. This data is typically sent by the IoT devices in the background so that support engineers can monitor the fleet and troubleshoot issues in a timely manner.

The Objectives

In the previous article, we built a real-time dashboard that pulls data from the GridDB database and refreshes data visualizations automatically. In this article, we will build a simple application with Python’s kivy package. …
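As a preview of what the Kivy side looks like, here is a minimal, hypothetical skeleton of such an application. The class name and label text are placeholders; the finished app will fill the layout with widgets backed by GridDB queries.

```python
# Minimal Kivy skeleton (hypothetical names); the finished app would replace the
# label with widgets populated from GridDB query results.
from kivy.app import App
from kivy.uix.label import Label

class ReportApp(App):
    def build(self):
        # Root widget of the application window
        return Label(text="IoT data usage report")

if __name__ == "__main__":
    ReportApp().run()
```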


In a previous blog on Docker, we ran the GridDB server in one container and the application in another. It worked well, but there have been many requests to run GridDB in a container on Docker Desktop while running the application on the host.

It should be easy, right? Wrong. On a Linux host it just works, but on Windows and macOS hosts the different networking stacks don't allow direct routing to the container, which prevents the usual GridDB configuration from working.

After spending time trying to make Docker on Windows or macOS behave like Linux, we discovered a trick when configuring…


In Part 1 of this blog, we implemented a GridDB Python script to save and retrieve Twitter data. In this blog, we will continue with sentiment analysis and visualization of the sentiment data. We will calculate a sentiment value for each tweet, store the values, and visualize them to draw useful insights about popular fashion brands. Furthermore, we will also apply data science techniques such as hierarchical clustering and visualize the results using dendrograms. …
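As a rough preview of those two steps, the sketch below scores a handful of toy tweets with TextBlob and clusters the brands with SciPy; the actual libraries, brands, and tweet data used in the article may differ.

```python
# Hypothetical sketch: per-brand sentiment with TextBlob, then hierarchical
# clustering visualized as a dendrogram. Brands and tweets are toy placeholders.
import numpy as np
import matplotlib.pyplot as plt
from textblob import TextBlob
from scipy.cluster.hierarchy import linkage, dendrogram

tweets = {
    "brand_a": ["love the new collection", "great quality jacket"],
    "brand_b": ["shipping was slow", "not impressed with the fit"],
    "brand_c": ["decent price", "okay but nothing special"],
}

labels, features = [], []
for brand, texts in tweets.items():
    polarity = np.mean([TextBlob(t).sentiment.polarity for t in texts])
    subjectivity = np.mean([TextBlob(t).sentiment.subjectivity for t in texts])
    labels.append(brand)
    features.append([polarity, subjectivity])

# Ward linkage on the per-brand sentiment features
Z = linkage(np.array(features), method="ward")
dendrogram(Z, labels=labels)
plt.title("Brands clustered by average tweet sentiment")
plt.show()
```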


In this tutorial, we will see how to analyze time-series data stored in GridDB using Python. The outline of the tutorial is as follows —

1. Loading the dataset using SQL and Pandas
2. Preprocessing the data to deal with null and missing values
3. Building a classifier for our data

Prerequisites

This tutorial assumes prior installation of GridDB, Python 3, and the associated libraries. If you have not installed any of the packages below, go ahead and do so before continuing with the tutorial.

1. GridDB
2. Python 3
3. GridDB Python Client
4. NumPy
5. Pandas
6. Matplotlib
7. Scikit-learn
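With those in place, step 1 of the outline (pulling rows out of GridDB and into Pandas) might look roughly like the sketch below. The connection settings, container name, and column names are placeholders, and the snippet assumes the standard GridDB Python client rather than the SQL interface the full tutorial may use.

```python
# Hypothetical sketch: fetch a time-series container from GridDB into pandas.
# Host, port, cluster, credentials, and container/column names are placeholders.
import griddb_python as griddb
import pandas as pd

factory = griddb.StoreFactory.get_instance()
gridstore = factory.get_store(
    host="239.0.0.1", port=31999,
    cluster_name="defaultCluster", username="admin", password="admin",
)

container = gridstore.get_container("sensor_readings")
query = container.query("select *")   # TQL: read every row in the container
rs = query.fetch()

rows = []
while rs.has_next():
    rows.append(rs.next())

df = pd.DataFrame(rows, columns=["timestamp", "sensor_id", "value"])
print(df.head())
```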


This blog is a continuation of our previous cryptocurrency blog, found here.

With Bitcoin currently leading the cryptocurrency market, its market cap stands at $1,083,949,725,691 as of this writing; by the time we publish this blog post, it may have climbed even higher, thanks to its high volatility.

Satoshi Nakamoto launched the Bitcoin network in 2009, which was followed by the cryptocurrency's first-ever recorded commercial transaction, in which Laszlo Hanyecz, a programmer, purchased two Papa John's pizzas for 10,000 bitcoins. Back then, the coin didn't hold any real value and wasn't that big of a deal. But in the…


Introduction

Today, we will cover how to scrape data from any website using Python's Scrapy library. We will then save the data in JSON and HTML files. Finally, we will see how we can also store this data in GridDB for long-term and efficient use.
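To give a feel for how little code a Scrapy spider needs, here is a minimal hypothetical spider; the target site, spider name, and CSS selectors below are placeholders rather than the site scraped later in this post.

```python
# Hypothetical minimal Scrapy spider; quotes.toscrape.com is a stand-in target.
import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]

    def parse(self, response):
        # Yield one item per quote block on the page
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
```

Running it with `scrapy runspider quotes_spider.py -o quotes.json` writes the scraped items straight to a JSON file, the same export step we will later pair with GridDB storage.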

Pre-requisites

This post requires the prior installation of the following:

We also recommend installing Anaconda Navigator, if not already installed. Anaconda provides a large range of tools for data scientists to experiment with. Also, a virtual environment can help you meet the specific version requirements while running an application without interfering with the actual system paths.

Creating a new project using Scrapy


Introduction

In a recent blog, we covered querying geospatial data with manual bounding boxes. In this blog, we’ll take the same dataset/query but perform the query using GeoHashes instead.

Geohashes are alphanumeric encodings of areas on Earth. The more bits in the encoding, the more precisely it describes the area. A 1-character Geohash covers approximately 2,500 km, while an 8-character Geohash covers approximately 20 m. For example, the 9q geohash covers most of California and the Western United States, while the San Francisco / San Jose Bay Area is spread over several geohashes: 9qb, 9qc, 9q8, and 9q9.
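For a quick sanity check of those precisions, the sketch below encodes a San Francisco coordinate with the pygeohash package; the original post may compute geohashes with a different library or inside GridDB itself.

```python
# Hypothetical sketch using pygeohash to show how precision narrows the cell.
import pygeohash as pgh

lat, lon = 37.7793, -122.4193   # roughly downtown San Francisco

print(pgh.encode(lat, lon, precision=2))   # '9q'  -> most of California and the western US
print(pgh.encode(lat, lon, precision=3))   # '9q8' -> one of the Bay Area cells
print(pgh.encode(lat, lon, precision=8))   # an 8-character cell, roughly 20 m across
```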


Introduction

Today, we will be building a Bank Loan Classification model from scratch using the data stored in GridDB. In this post, we will cover the following:

  1. Storing the data in GridDB
  2. Extracting the data from GridDB
  3. Building a Logistic Regression Model using Pandas
  4. Evaluating our model using heat map and correlation matrix

We will begin by installing the prerequisites and setting up our environment. We will be using GridDB's Python connector for this tutorial, as the primary language used for model building is Python. …
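As a rough sketch of steps 3 and 4, the snippet below trains a logistic regression on a tiny in-memory stand-in for the loan data and plots the confusion-matrix and correlation heat maps; the real tutorial loads the full dataset from GridDB, and the column names here are assumptions.

```python
# Hypothetical sketch of model building and evaluation on a toy stand-in for
# the bank loan data; the real DataFrame comes from GridDB.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

df = pd.DataFrame({
    "Income":        [49, 34, 180, 22, 130, 45, 90, 160],
    "CCAvg":         [1.6, 1.5, 8.9, 0.3, 4.1, 1.0, 3.5, 7.2],
    "Personal Loan": [0, 0, 1, 0, 1, 0, 0, 1],
})

X = df.drop(columns=["Personal Loan"])
y = df["Personal Loan"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Confusion matrix as a heat map, then the feature correlation matrix
sns.heatmap(confusion_matrix(y_test, model.predict(X_test)), annot=True, fmt="d")
plt.show()
sns.heatmap(df.corr(), annot=True, cmap="coolwarm")
plt.show()
```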


Introduction: What is a Phishing Website?

Curiosity alone can get your personal information leaked to bad actors. Are you the type who clicks on any link a friend sends you? That's dangerous! Hackers have many ways of gathering information: they can gain your trust through social engineering and get you to do things you would never do for a stranger, and one of those things is clicking on links.

Phishing is a common and effective technique used by hackers to gather information: passwords, emails, names, and so on. …


In this project, we use GridDB to create a machine learning platform in which Kafka is used to import stock market data from Alphavantage, a market data provider. TensorFlow and Keras train a model that is then stored in GridDB and used with LSTM prediction to find anomalies in daily intraday trading history. The last piece: the data is visualized in Grafana, and GridDB is configured to send notifications via its REST Trigger function to Twilio's SendGrid.
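For readers who want a feel for the prediction step before the full walkthrough, here is a minimal hypothetical Keras LSTM sketch on synthetic data; the project's actual window size, features, and anomaly threshold are not shown in this excerpt.

```python
# Hypothetical LSTM sketch: predict the next value from a sliding window and
# flag points with unusually large prediction error. Synthetic data only.
import numpy as np
from tensorflow import keras

window = 10                                   # past values used per prediction
prices = np.sin(np.linspace(0, 20, 200))      # stand-in for intraday prices

X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X.reshape((-1, window, 1))

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

# Anomalies: points whose prediction error is far above the average error
errors = np.abs(model.predict(X, verbose=0).ravel() - y)
anomalies = np.where(errors > errors.mean() + 3 * errors.std())[0]
print("anomalous indices:", anomalies)
```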

The actual machine learning portion of this project was inspired by posts on Towards Data Science and Curiously. This…

Israel Imru
