Scrapers and Automations
Scraping / Automation / Bot / Python

MB Report Creator

All payment companies in Turkey report to the Central Bank of the Republic of Turkey on a quarterly, semi-annual, and annual basis. Companies usually have Excel files that they pull from their own systems or prepare by hand. This data is edited and converted into TXT files; the files are given the required names, compressed, and uploaded to the Central Bank's system.

I developed a tool that automates this periodic process. We choose the notification type in the graphical interface, and the matching file types appear below. We select the appropriate file type and source file, then enter the required information on the right. The Prepare File command generates the selected file, and we repeat this for every file the notification type requires. The Prepare Report command then compresses the generated files, and the report is ready to upload to the system.
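
The convert-and-compress part of this pipeline can be sketched in a few lines. The delimiter, record layout, and file names below are placeholders; the real formats come from the Central Bank's specification:

```python
import zipfile

def rows_to_txt(rows, delimiter="|"):
    """Join each row's fields into one record per line.
    The real delimiter and field layout come from the Central Bank's spec."""
    return "\n".join(delimiter.join(str(field) for field in row) for row in rows)

def build_report(txt_files, archive_path):
    """Compress the prepared TXT files into a single ZIP archive."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in txt_files.items():
            zf.writestr(name, content)
    return archive_path
```

In the actual tool, the rows come from the selected Excel file and the archive name follows the bank's naming convention.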

This project can be customized for use by any company. For more information, please refer to the Contact section.


Telegram Bot

Telegram is a popular instant messaging app that supports one-to-one and group messaging. Many companies handle their corporate communication through Telegram, and these processes can be automated with the Python bot I wrote on top of the Telegram APIs. With this bot, you can send messages, media, and locations, and read and process incoming messages.
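
As a rough sketch of how such a bot talks to the Bot API over the standard library (the token and chat ID are placeholders you get from BotFather and your own chat):

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.telegram.org"

def method_url(token, method):
    """Build the Bot API URL for a method such as sendMessage or sendPhoto."""
    return f"{API_BASE}/bot{token}/{method}"

def send_message(token, chat_id, text):
    """Send a plain text message; returns the API's JSON response."""
    data = urllib.parse.urlencode({"chat_id": chat_id, "text": text}).encode()
    with urllib.request.urlopen(method_url(token, "sendMessage"), data=data,
                                timeout=10) as resp:
        return json.load(resp)
```

Reading incoming messages works the same way through the getUpdates method, or through a webhook.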

Binance Scraper

Binance is the cryptocurrency exchange with the highest trading volume in the world. With the Python script I coded, I can instantly fetch the exchange rates on Binance, using either the Binance public API or scraping.

With the Binance APIs, exchange rate tracking, deposits and withdrawals, and many other operations can be automated.
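
A minimal sketch of the public-API route, using the spot REST endpoint for the last traded price:

```python
import json
import urllib.request

TICKER_URL = "https://api.binance.com/api/v3/ticker/price?symbol={symbol}"

def parse_price(payload):
    """Extract the rate from a /api/v3/ticker/price response body."""
    return float(json.loads(payload)["price"])

def get_price(symbol):
    """Fetch the current price for a trading pair such as BTCUSDT."""
    with urllib.request.urlopen(TICKER_URL.format(symbol=symbol),
                                timeout=10) as resp:
        return parse_price(resp.read())
```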

CoinMarketCap Scraper

CoinMarketCap is one of the most widely used websites for cryptocurrency exchange rates and news about cryptocurrencies. I can get instant rate information with the Python script I coded.
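
Since much of the page is rendered client-side, the scraping side works on the HTML the browser actually produced. A stdlib sketch that pulls price text out of such a snapshot; the priceValue class name is an assumption and must be checked against the live page's markup:

```python
from html.parser import HTMLParser

class PriceExtractor(HTMLParser):
    """Collect the text of tags whose class attribute contains CLASS_MARKER.
    'priceValue' is an assumed marker, not a documented CoinMarketCap API."""
    CLASS_MARKER = "priceValue"

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class") or ""
        self._in_price = self.CLASS_MARKER in cls

    def handle_data(self, data):
        if self._in_price and data.strip():
            self.prices.append(data.strip())
            self._in_price = False
```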

Yelp Scraper

Yelp is a website that lists businesses and user reviews. It holds a wealth of data, such as addresses, phone numbers, photos, and user comments. I can extract this information with the scraper I coded in Python.

Google Sheets Bot

Google Sheets is a web application for creating online spreadsheets. Using its API, the Python script I coded can automate operations on those spreadsheets: creating a spreadsheet, adding a sheet, writing data into a cell, reading data from a cell, and working on ranges of cells.
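
A sketch of the idea using the third-party gspread library (the credentials path and sheet name are placeholders); the a1 helper shows the row/column arithmetic behind cell ranges:

```python
import string

def a1(row, col):
    """Convert 1-based (row, col) indices to A1 notation, e.g. (2, 3) -> 'C2'."""
    letters = ""
    while col > 0:
        col, rem = divmod(col - 1, 26)
        letters = string.ascii_uppercase[rem] + letters
    return f"{letters}{row}"

def demo():
    """Requires gspread and a service-account key file; not run automatically."""
    import gspread
    gc = gspread.service_account(filename="credentials.json")  # placeholder path
    ws = gc.open("Demo Sheet").sheet1                          # placeholder name
    ws.update_acell(a1(1, 1), "hello")  # write a single cell
    print(ws.acell("A1").value)         # read it back
```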

Selenium Login & Screenshot

Selenium is a browser automation tool, but combined with Python it can do much more. Routine work can be automated with it, and data that the Requests library cannot reach can be accessed through a real browser.

Many websites only list data after login. Although Selenium is designed as an automation tool, it can be used for scraping as a last resort.

I can perform automation, logins, and test operations with Selenium, and capture a screenshot of a website's visible viewport, of the full page, or of a single HTML element.
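
A sketch of the login-and-screenshot flow. The form field names and the header element are assumptions about the target page, and the full-page capture uses a Firefox-specific WebDriver call:

```python
import re

def shot_name(url, suffix="viewport"):
    """Derive a safe PNG filename from a URL, e.g. 'example_com__viewport.png'."""
    host = re.sub(r"^https?://", "", url).split("/")[0]
    return re.sub(r"[^A-Za-z0-9]", "_", host) + f"__{suffix}.png"

def capture(url, username, password):
    """Log in and take three kinds of screenshots; needs a local geckodriver."""
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get(url)
    driver.find_element(By.NAME, "username").send_keys(username)  # assumed field name
    driver.find_element(By.NAME, "password").send_keys(password)  # assumed field name
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
    driver.save_screenshot(shot_name(url))                           # visible viewport
    driver.get_full_page_screenshot_as_file(shot_name(url, "full"))  # Firefox-only
    driver.find_element(By.TAG_NAME, "header").screenshot(shot_name(url, "header"))
    driver.quit()
```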

FAANG Scraper

FAANG is an acronym formed from the first letters of Facebook, Amazon, Apple, Netflix, and Google. These are among the largest companies in the technology space, each with a huge workforce, and they hire dozens of people every day.

I collect job postings from these companies' career pages with scrapers I coded in Python, then parse the data and add it to a database.
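
The parse-and-store step can be sketched with SQLite. The table columns are illustrative; the UNIQUE constraint on the URL keeps repeated runs from duplicating postings:

```python
import sqlite3

SCHEMA = """CREATE TABLE IF NOT EXISTS jobs (
    company TEXT, title TEXT, location TEXT, url TEXT UNIQUE)"""

def save_jobs(conn, jobs):
    """Insert parsed postings, skipping URLs already stored; returns row count."""
    conn.execute(SCHEMA)
    conn.executemany(
        "INSERT OR IGNORE INTO jobs (company, title, location, url) "
        "VALUES (?, ?, ?, ?)",
        [(j["company"], j["title"], j["location"], j["url"]) for j in jobs])
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0]
```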

Linkedin Scraper

LinkedIn Scraper is a Python application with full Airtable integration. It runs on a Linux-based VPS: it reads company codes from the Companies table in Airtable, collects the data, and writes it to the Jobs table. The user controls the application by changing the Status column in the Companies table. It can also work with other databases.
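
A sketch of the Airtable side using its REST API over the standard library. The base and table identifiers and the API key are placeholders, and the Code/Status field names are illustrative:

```python
import json
import urllib.request

API = "https://api.airtable.com/v0/{base}/{table}"  # Airtable REST endpoint

def pending_companies(payload, status="Todo"):
    """Pick company codes from a Companies-table response whose Status matches.
    'Code' and 'Status' are assumed field names for this sketch."""
    records = json.loads(payload)["records"]
    return [r["fields"]["Code"] for r in records
            if r["fields"].get("Status") == status]

def fetch_table(base, table, api_key):
    """Download one table's records; base, table, and api_key are placeholders."""
    req = urllib.request.Request(API.format(base=base, table=table),
                                 headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()
```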

Indeed Scraper

Indeed is a career site listing millions of job postings, organized into country-specific subdomains by country code. With the scraper I coded in Python, I can fetch job postings on Indeed by country and keyword, parse them, and save the results in JSON and CSV formats.
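
The country-subdomain logic can be sketched as a URL builder. The step-of-10 pagination via the start parameter is an assumption based on observed Indeed URLs:

```python
import urllib.parse

def search_url(country, query, page=0):
    """Build an Indeed search URL; the country code selects the subdomain
    (US listings live on www.indeed.com)."""
    cc = country.lower()
    host = "www.indeed.com" if cc == "us" else f"{cc}.indeed.com"
    params = {"q": query}
    if page:
        params["start"] = page * 10  # assumed page size of 10 results
    return f"https://{host}/jobs?{urllib.parse.urlencode(params)}"
```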

Google Jobs Scraper

Google Jobs collects postings from career sites and from companies' own career pages and presents them to users on a single page, listing results by location and keyword.

With this scraper, which I coded in Python, I fetch the search results and save them in CSV format. Career platforms can be built on top of it.
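
The save step is a plain CSV dump; the column names below are illustrative:

```python
import csv

FIELDS = ["title", "company", "location", "via"]  # illustrative column names

def save_results(results, path):
    """Write scraped result dicts to a CSV file and return its path."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, extrasaction="ignore")
        writer.writeheader()
        writer.writerows(results)
    return path
```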

Other Projects