Blog

Harnessing Data Science For Agricultural Crops Prediction

Published May 24, 2023

Using AI to determine optimal crop selection can have a significant impact on ending world hunger, particularly in developing countries like Nepal. Even though Nepal is an agricultural country, we import a huge amount of food every year, which makes Nepal's trade deficit even worse. By leveraging AI technologies such as machine learning and data analysis, we can analyze factors like climate, soil quality, market demand, and available resources. This data-driven approach helps farmers make informed decisions about which crops to plant, increasing agricultural productivity and improving food security. AI algorithms can provide personalized recommendations based on local conditions, enabling farmers to optimize yields, reduce crop failures, and maximize profits. Ultimately, AI-driven crop selection can contribute to sustainable farming practices, alleviate poverty, and combat hunger in Nepal and similar regions worldwide.

Building my own IMDB movies dataset using Scrapy

Published April 12, 2023

Scrapy, a free and open-source web crawling framework first released on June 26, 2008 under the BSD license, is used for web scraping and extracting structured data. Scrapy is widely used by employers, for both freelancing and in-house jobs, for applications ranging from data mining to monitoring websites for changes. It is popular for its ability to handle requests asynchronously, i.e., it can execute multiple tasks concurrently, such as making requests, processing responses, and saving data. It lets you focus on the data extraction process, using CSS selectors and XPath expressions, rather than on the complex internals of how spiders are supposed to function. With Scrapy, you can easily build scraping pipelines that handle multiple pages, parse complex HTML structures, and store data in various formats.
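To give a flavour of what this looks like in practice, here is a minimal sketch of a Scrapy spider for a movie chart page. The URL, CSS selectors, and field names are assumptions for illustration only; the real IMDB markup (and the selectors used in the actual dataset project) may differ.

```python
# A minimal Scrapy spider sketch; selectors and URL are illustrative assumptions.
import scrapy


class MoviesSpider(scrapy.Spider):
    name = "movies"
    start_urls = ["https://www.imdb.com/chart/top/"]

    def parse(self, response):
        # Each table row is assumed to hold one movie title and its rating.
        for row in response.css("tbody tr"):
            yield {
                "title": row.css("td.titleColumn a::text").get(),
                "rating": row.css("td.imdbRating strong::text").get(),
            }
```

Running `scrapy crawl movies -o movies.json` would then write the yielded items to a JSON file, which is the kind of structured output the post builds its dataset from.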

Code for Nepal and DataCamp Partnership: Empowering Data Literacy and Career Growth

Published April 11, 2023

Code for Nepal is a non-profit organization that aims to increase digital literacy and access to technology in Nepal. We focus on providing training and resources for coding and data literacy, as well as advocating for policies and programs that support technology education and access. Code for Nepal also works to promote the use of technology to address social issues and improve the lives of Nepali people.

Survey Form Deployment and Data Pipeline

Published March 28, 2023

This project is about collecting data through a survey; specifically, we collect survey data on social media usage. We will deploy a single page containing the survey form to the public internet and store all the responses in a MongoDB database. We will then create an API that fetches all the survey data from MongoDB. You can see the deployed survey form at this link.
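As a rough sketch of how such an API could be wired up, the snippet below connects FastAPI to MongoDB with pymongo. The connection string, database name ("survey_db"), collection name ("responses"), and endpoint paths are hypothetical placeholders, not the project's actual code.

```python
# Minimal FastAPI + MongoDB sketch; names and connection details are assumptions.
from fastapi import FastAPI
from pymongo import MongoClient

app = FastAPI()
client = MongoClient("mongodb://localhost:27017")
collection = client["survey_db"]["responses"]


@app.post("/responses")
def submit_response(response: dict):
    """Store one submitted survey form as a MongoDB document."""
    collection.insert_one(dict(response))
    return {"status": "saved"}


@app.get("/responses")
def list_responses():
    """Fetch all survey responses, dropping Mongo's internal _id field."""
    return list(collection.find({}, {"_id": 0}))
```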
After deploying the survey form, we will create a data pipeline that fetches the data from the database and cleans it for further use in visualization and analysis. We will use Airflow to schedule these two processes one after the other. Airflow is an open-source platform for developing, scheduling, and monitoring workflows, and it provides a graphical user interface to manage them.
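A minimal sketch of such a two-step DAG, assuming Airflow 2.x and placeholder fetch/clean callables (the real tasks pull from MongoDB and clean the responses), might look like this:

```python
# Two-task Airflow DAG sketch: fetch the data, then clean it.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def fetch_data(**context):
    # Placeholder: query the survey collection and write raw records to disk.
    ...


def clean_data(**context):
    # Placeholder: load the raw records, tidy them, and save a clean file.
    ...


with DAG(
    dag_id="survey_pipeline",
    start_date=datetime(2023, 3, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    fetch = PythonOperator(task_id="fetch_data", python_callable=fetch_data)
    clean = PythonOperator(task_id="clean_data", python_callable=clean_data)

    fetch >> clean  # run the cleaning step only after the fetch succeeds
```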

Workflow of Data Engineering Project on AWS

Published March 27, 2023

1. Architecture Diagram:

A/B testing Implementation Using Python

Published March 16, 2023

In the business domain, it is important to use data to make informed decisions that improve the efficacy of a product or marketing strategy. A/B testing, also called bucket testing, is a widely used and highly effective statistical method for optimizing digital assets such as web pages, emails, and ads. By testing different variations of these assets, businesses can gather data on how each version performs and use that information to make data-driven decisions about which elements to include or remove. A/B testing can be used to optimize a variety of factors, including headlines, images, copy, calls to action, and more, and can lead to significant improvements in key metrics such as click-through rates, conversion rates, and revenue. This blog post will guide you through a detailed process for implementing A/B testing, along with an example implementation using Python.
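As a taste of what the implementation involves, here is a minimal sketch of a two-proportion z-test with statsmodels, comparing conversion rates between a control and a variant. The visitor and conversion counts are made up purely for illustration and are not from the post itself.

```python
# Two-proportion z-test sketch for an A/B experiment; counts are illustrative.
from statsmodels.stats.proportion import proportions_ztest

conversions = [120, 145]   # converted users in control (A) and variant (B)
visitors = [2400, 2390]    # total users shown each version

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z-statistic: {stat:.3f}, p-value: {p_value:.4f}")

# If p_value falls below the chosen significance level (e.g. 0.05),
# the difference in conversion rates is unlikely to be due to chance.
```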

Data Pipeline with Apache Airflow and FastAPI

Published February 21, 2023

The word pipeline generally brings to mind an image of connected stages, and data pipelines are no different. These days, organizations and institutions generate huge volumes of data from various sources; this data is known as "Big Data". To process this big data efficiently, saving time and improving results, we need a data pipeline. Data pipelines are built for the ETL process, in which data is collected from various sources, transformed into the desired format, and finally stored.
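To make the ETL idea concrete, here is a minimal sketch with a hypothetical CSV source and SQLite destination. It is a toy illustration of extract, transform, and load as stages; the pipeline described in this post orchestrates the real steps with Airflow and serves data through FastAPI.

```python
# Toy ETL sketch: extract from CSV, transform with pandas, load into SQLite.
import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    # Extract: read raw records from the source file.
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transform: drop incomplete rows and normalize column names.
    df = df.dropna()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df


def load(df: pd.DataFrame, db_path: str) -> None:
    # Load: store the cleaned records in the destination database.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("records", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    load(transform(extract("raw_data.csv")), "warehouse.db")
```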