I recently had to analyze web performance metrics for a web platform. I started out querying the data with DBeaver, but quickly realized I would have to save a huge amount of data locally before I could analyze it. Instead, I decided to use Python, which eliminates the need to move the data around.
Here is how you do it.
First, install the required libraries: sqlalchemy, pymysql, and pandas (e.g. `pip install sqlalchemy pymysql pandas`).
from sqlalchemy import create_engine
import pymysql
import pandas as pd

# Connection URL format: mysql+pymysql://username:password@hostname/dbname
sqlEngine = create_engine('mysql+pymysql://username:password@hostname/dbname')
sql_query = "select * from test_table limit 100;"

db_connection = sqlEngine.connect()
df = pd.read_sql(sql_query, db_connection)
db_connection.close()
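If the result set really is huge, pandas can also stream it in batches via the `chunksize` parameter of `read_sql`, so you never hold more than one batch in memory at a time. Here is a minimal, self-contained sketch of that pattern; it uses an in-memory SQLite database (and a made-up three-row `test_table`) purely so it runs without a MySQL server, but for MySQL you would swap the URL for the `mysql+pymysql://...` one above.

```python
from sqlalchemy import create_engine
import pandas as pd

# In-memory SQLite stands in for the MySQL server in this sketch.
engine = create_engine('sqlite:///:memory:')

# Create a small stand-in for test_table (hypothetical sample data).
pd.DataFrame({'id': [1, 2, 3], 'value': ['a', 'b', 'c']}).to_sql(
    'test_table', engine, index=False)

# chunksize makes read_sql return an iterator of DataFrames instead of
# one big frame, so large tables can be processed batch by batch.
with engine.connect() as conn:
    chunks = pd.read_sql('select * from test_table', conn, chunksize=2)
    df = pd.concat(chunks, ignore_index=True)

print(len(df))
```

Using `engine.connect()` as a context manager also closes the connection automatically, so the explicit `close()` call is no longer needed.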
That’s it.