IOupdate | IT News and Selfhosting

How to Run MySQL Queries from Linux Command Line

By Mark · November 30, 2025

Unlock the full power of your database administration skills by mastering MySQL and MariaDB queries directly from the Linux command line. This comprehensive guide is tailored for tech-savvy readers eager to streamline their workflows, automate repetitive tasks, and perform efficient data extraction. Discover how to execute complex queries, save results to files, format output for various needs, and leverage powerful shell scripting techniques for unparalleled control. Dive in to elevate your database management with practical examples and indispensable tips.

Before we dive into the commands, let’s set up the example databases we’ll be working with throughout this guide, so you can follow along and practice these techniques on your own system.

Setting Up Your Database Environment

To fully grasp the power of command-line database interaction, we’ll start by setting up two distinct databases: a simple tecmintdb for basic operations and a more complex employees database for advanced queries involving joins and aggregations.

Creating the tecmintdb Database

First, let’s create the tecmintdb database and a tutorials table within it.

sudo mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS tecmintdb;"

Next, to create a database table named tutorials in the database tecmintdb, run the command below:

sudo mysql -u root -p tecmintdb << 'EOF'
CREATE TABLE IF NOT EXISTS tutorials (
    tut_id INT NOT NULL AUTO_INCREMENT,
    tut_title VARCHAR(100) NOT NULL,
    tut_author VARCHAR(40) NOT NULL,
    submission_date DATE,
    PRIMARY KEY (tut_id)
);

INSERT INTO tutorials (tut_title, tut_author, submission_date) VALUES
('Getting Started with Linux', 'John Smith', '2024-01-15'),
('Advanced Bash Scripting', 'Sarah Johnson', '2024-02-20'),
('MySQL Database Administration', 'Mike Williams', '2024-03-10'),
('Apache Web Server Configuration', 'Emily Brown', '2024-04-05'),
('Python for System Administrators', 'David Lee', '2024-05-12'),
('Docker Container Basics', 'Lisa Anderson', '2024-06-18'),
('Kubernetes Orchestration', 'Robert Taylor', '2024-07-22'),
('Linux Security Hardening', 'Jennifer Martinez', '2024-08-30');
EOF

Verify the data was inserted correctly:

sudo mysql -u root -p -e "USE tecmintdb; SELECT * FROM tutorials;"

Creating the employees Database

Now, let’s create a more complex employees database with multiple related tables. This database will be used for our advanced query examples.

sudo mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS employees;"

Create the employees table:

sudo mysql -u root -p employees << 'EOF'
CREATE TABLE IF NOT EXISTS employees (
    emp_no INT NOT NULL AUTO_INCREMENT,
    birth_date DATE NOT NULL,
    first_name VARCHAR(14) NOT NULL,
    last_name VARCHAR(16) NOT NULL,
    gender ENUM('M','F') NOT NULL,
    hire_date DATE NOT NULL,
    PRIMARY KEY (emp_no)
);

INSERT INTO employees (emp_no, birth_date, first_name, last_name, gender, hire_date) VALUES
(10001, '1953-09-02', 'Georgi', 'Facello', 'M', '1984-06-02'),
(10002, '1964-06-02', 'Bezalel', 'Simmel', 'F', '1984-11-21'),
(10003, '1959-12-03', 'Parto', 'Bamford', 'M', '1984-08-28'),
(10004, '1954-05-01', 'Chirstian', 'Koblick', 'M', '1984-12-01'),
(10005, '1955-01-21', 'Kyoichi', 'Maliniak', 'M', '1984-09-15'),
(10006, '1953-04-20', 'Anneke', 'Preusig', 'F', '1985-02-18'),
(10007, '1957-05-23', 'Tzvetan', 'Zielinski', 'F', '1985-03-20'),
(10008, '1958-02-19', 'Saniya', 'Kalloufi', 'M', '1984-07-11'),
(10009, '1952-04-19', 'Sumant', 'Peac', 'F', '1985-02-18'),
(10010, '1963-06-01', 'Duangkaew', 'Piveteau', 'F', '1984-08-24');
EOF

Create the salaries table:

sudo mysql -u root -p employees << 'EOF'
CREATE TABLE IF NOT EXISTS salaries (
    emp_no INT NOT NULL,
    salary INT NOT NULL,
    from_date DATE NOT NULL,
    to_date DATE NOT NULL,
    PRIMARY KEY (emp_no, from_date),
    FOREIGN KEY (emp_no) REFERENCES employees(emp_no) ON DELETE CASCADE
);

INSERT INTO salaries (emp_no, salary, from_date, to_date) VALUES
(10001, 60117, '1984-06-02', '1985-06-02'),
(10001, 62102, '1985-06-02', '1986-06-02'),
(10001, 66074, '1986-06-02', '9999-01-01'),
(10002, 65828, '1984-11-21', '1985-11-21'),
(10002, 65909, '1985-11-21', '9999-01-01'),
(10003, 40006, '1984-08-28', '1985-08-28'),
(10003, 43616, '1985-08-28', '9999-01-01'),
(10004, 40054, '1984-12-01', '1985-12-01'),
(10004, 42283, '1985-12-01', '9999-01-01'),
(10005, 78228, '1984-09-15', '1985-09-15'),
(10005, 82507, '1985-09-15', '9999-01-01'),
(10006, 40000, '1985-02-18', '1986-02-18'),
(10006, 43548, '1986-02-18', '9999-01-01'),
(10007, 56724, '1985-03-20', '1986-03-20'),
(10007, 60605, '1986-03-20', '9999-01-01'),
(10008, 46671, '1984-07-11', '1985-07-11'),
(10008, 48584, '1985-07-11', '9999-01-01'),
(10009, 60929, '1985-02-18', '1986-02-18'),
(10009, 64604, '1986-02-18', '9999-01-01'),
(10010, 72488, '1984-08-24', '1985-08-24'),
(10010, 74057, '1985-08-24', '9999-01-01');
EOF

Create the departments table for more complex joins:

sudo mysql -u root -p employees << 'EOF'
CREATE TABLE IF NOT EXISTS departments (
    dept_no CHAR(4) NOT NULL,
    dept_name VARCHAR(40) NOT NULL,
    PRIMARY KEY (dept_no),
    UNIQUE KEY (dept_name)
);

INSERT INTO departments (dept_no, dept_name) VALUES
('d001', 'Marketing'),
('d002', 'Finance'),
('d003', 'Human Resources'),
('d004', 'Production'),
('d005', 'Development'),
('d006', 'Quality Management');
EOF

Create the dept_emp table to link employees to departments:

sudo mysql -u root -p employees << 'EOF'
CREATE TABLE IF NOT EXISTS dept_emp (
    emp_no INT NOT NULL,
    dept_no CHAR(4) NOT NULL,
    from_date DATE NOT NULL,
    to_date DATE NOT NULL,
    PRIMARY KEY (emp_no, dept_no),
    FOREIGN KEY (emp_no) REFERENCES employees(emp_no) ON DELETE CASCADE,
    FOREIGN KEY (dept_no) REFERENCES departments(dept_no) ON DELETE CASCADE
);

INSERT INTO dept_emp (emp_no, dept_no, from_date, to_date) VALUES
(10001, 'd005', '1984-06-02', '9999-01-01'),
(10002, 'd005', '1984-11-21', '9999-01-01'),
(10003, 'd004', '1984-08-28', '9999-01-01'),
(10004, 'd004', '1984-12-01', '9999-01-01'),
(10005, 'd003', '1984-09-15', '9999-01-01'),
(10006, 'd005', '1985-02-18', '9999-01-01'),
(10007, 'd004', '1985-03-20', '9999-01-01'),
(10008, 'd005', '1984-07-11', '9999-01-01'),
(10009, 'd006', '1985-02-18', '9999-01-01'),
(10010, 'd006', '1984-08-24', '9999-01-01');
EOF

Verify everything is set up correctly by listing the tables:

sudo mysql -u root -p -e "USE employees; SHOW TABLES;"

Now you’ve got both databases set up with sample data, and you can follow along with all the examples in this guide. The tecmintdb database is perfect for simple queries, while the employees database lets you practice more complex operations like joins and aggregations.

Executing and Managing MySQL/MariaDB Queries

The core of command-line database interaction lies in effectively executing queries and handling their output. Let's explore the essential techniques.

Basic Query Execution and Database Exploration

To view all the databases on your server, you can issue the following command:

sudo mysql -u root -p -e "show databases;"

You can also use the -e flag to execute any SQL statement directly, making it incredibly versatile for quick checks or single-statement operations.

sudo mysql -u root -p -e "USE tecmintdb; SELECT * FROM tutorials;"

Saving Query Results to a File

One of the most powerful features of running queries from the Linux command line is the ability to pipe output directly to a file. This is particularly invaluable when dealing with queries that return massive datasets, allowing for offline analysis or integration with other scripts.

Run the query below and pipe the output to the tee command, followed by the filename where we want to store the results. For illustration, we will use the employees database and a simple join between the employees and salaries tables. Note that you will be prompted to enter the password for the database user:

sudo mysql -u root -p -e "USE employees; SELECT DISTINCT A.first_name, A.last_name FROM employees A JOIN salaries B ON A.emp_no = B.emp_no WHERE hire_date < '1985-01-31';" | tee queryresults.txt

You can then view the query results with the help of the cat command:

cat queryresults.txt

With the query results in plain text files, you can process the records more easily using other command-line utilities. Now that you’ve seen the basics, let’s explore some more advanced techniques that’ll make your command-line database work even more efficiently.

Formatting Output for Better Readability

The default table format is great for viewing in the terminal, but sometimes you need different formats. You can output results in a vertical format, which is particularly useful when dealing with tables that have many columns, preventing horizontal scrolling and improving clarity.

sudo mysql -u root -p -e "USE employees; SELECT * FROM employees LIMIT 1\G"

The \G terminator displays each row vertically instead of as a traditional table. Rather than a cramped horizontal layout, you get something like:

*************************** 1. row ***************************
    emp_no: 10001
birth_date: 1953-09-02
first_name: Georgi
 last_name: Facello
    gender: M
 hire_date: 1984-06-02

Exporting to CSV Format for Data Analysis

When you need to import query results into spreadsheet applications or other tools for deeper analysis, CSV (Comma Separated Values) format is your best bet. This is a common requirement for data extraction tasks.

sudo mysql -u root -p -e "USE employees; SELECT first_name, last_name, hire_date FROM employees WHERE hire_date < '1985-01-31';" | sed 's/\t/,/g' > employees.csv

This command pipes the tab-separated output from MySQL through sed to replace tabs with commas, creating a proper CSV file that opens cleanly in Excel, LibreOffice Calc, or any other spreadsheet software.
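One caveat: a plain tab-to-comma substitution produces broken CSV if any field itself contains a comma. A slightly safer sketch (the sample data in the printf is made up for illustration) wraps every field in double quotes with awk before joining:

```shell
# Simulate tab-separated mysql output where one field contains a comma,
# then quote each field so spreadsheet software parses columns correctly.
printf 'last_name\tfirst_name\thire_date\nFacello, Jr.\tGeorgi\t1984-06-02\n' |
awk 'BEGIN { FS = "\t"; OFS = "," } { for (i = 1; i <= NF; i++) $i = "\"" $i "\""; print }'
# Output:
# "last_name","first_name","hire_date"
# "Facello, Jr.","Georgi","1984-06-02"
```

In the real pipeline you would replace the printf with the mysql command from above. Fields containing double quotes would still need extra escaping, which is where a dedicated tool or a short script becomes worthwhile.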

Advanced Query Execution and Automation

Beyond basic querying, the Linux command line excels at automating and handling complex database operations, crucial for efficient MySQL/MariaDB administration.

Running Queries Without Password Prompts

If you’re automating database tasks with cron jobs or custom scripts, you don’t want to manually enter passwords every time. This is where MySQL configuration files come in handy for streamlined database automation.

Create a file at ~/.my.cnf with your credentials:

[client]
user=root
password=your_password_here

Then secure it so only you can read it:

chmod 600 ~/.my.cnf

Now you can run queries without the -p flag and without being prompted:

mysql -e "SHOW DATABASES;"

Just remember, storing passwords in plain text files has security implications, so only use this approach on servers where you control access. For production environments, consider using MySQL’s more secure authentication methods or environment variables.
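A middle ground between a plain-text ~/.my.cnf and full external authentication is MySQL's mysql_config_editor, which stores credentials in an obfuscated ~/.mylogin.cnf file. A quick sketch (the login-path name local is arbitrary; the set command prompts for the password interactively):

```shell
# Store credentials once (prompts for the password, writes ~/.mylogin.cnf):
mysql_config_editor set --login-path=local --host=localhost --user=root --password

# Run queries by referencing the login path instead of -u/-p:
mysql --login-path=local -e "SHOW DATABASES;"
```

Obfuscation is not encryption, so anyone who can read the file can still recover the password, but it keeps credentials out of shell history and plain-text configs. Note that MariaDB does not ship mysql_config_editor; there, stick with a chmod 600 ~/.my.cnf.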

Executing Complex Multi-Line Queries

Sometimes your queries are too complex to write in a single command line, especially when you’re dealing with multiple joins, subqueries, or intricate conditions. For this, storing SQL in a separate file is a much cleaner approach, making your queries part of organized shell scripting.

You can put your SQL in a file and execute it:

cat > complex_query.sql << 'EOF'
USE employees;
SELECT 
    e.first_name,
    e.last_name,
    d.dept_name,
    s.salary
FROM employees e
INNER JOIN dept_emp de ON e.emp_no = de.emp_no
INNER JOIN departments d ON de.dept_no = d.dept_no
INNER JOIN salaries s ON e.emp_no = s.emp_no
WHERE e.hire_date BETWEEN '1985-01-01' AND '1985-12-31'
    AND s.from_date = (
        SELECT MAX(from_date) 
        FROM salaries 
        WHERE emp_no = e.emp_no
    )
ORDER BY s.salary DESC
LIMIT 10;
EOF

Now execute it, directing the output to a file:

sudo mysql -u root -p < complex_query.sql > top_earners_1985.txt

This approach keeps your queries organized and reusable, and you can version control them with Git just like any other code.

Batch Processing Multiple Queries with Shell Scripts

If you need to run several related queries and save each result separately, creating a bash script is the most efficient way. This demonstrates the true power of shell scripting for database management.

#!/bin/bash

QUERIES=(
    "SELECT COUNT(*) as total_employees FROM employees"
    "SELECT dept_name, COUNT(*) as employee_count FROM dept_emp de JOIN departments d ON de.dept_no = d.dept_no GROUP BY dept_name"
    "SELECT YEAR(hire_date) as year, COUNT(*) as hires FROM employees GROUP BY YEAR(hire_date) ORDER BY year"
)

FILENAMES=(
    "total_count.txt"
    "dept_distribution.txt"
    "yearly_hires.txt"
)

for i in "${!QUERIES[@]}"; do
    echo "Running query $((i+1))..."
    mysql -e "USE employees; ${QUERIES[$i]}" > "${FILENAMES[$i]}"   # credentials read from ~/.my.cnf
    echo "Results saved to ${FILENAMES[$i]}"
done

Save this as a script (e.g., batch_queries.sh), make it executable with chmod +x batch_queries.sh, and you’ve got a reusable batch query tool.

Monitoring Long-Running Queries

When you’re running queries that might take a while, you want to see progress or at least know they’re still working. Combining your query with status output can be very helpful.

(sudo mysql -u root -p -e "USE employees; SELECT COUNT(*) FROM huge_table WHERE complex_condition;" && echo "Query completed at $(date)") | tee query_log.txt

For even longer queries, run them in the background and monitor the MySQL process list. Because watch re-runs the monitoring command every few seconds, it cannot answer a -p password prompt, so the second command relies on the ~/.my.cnf credentials file set up earlier:

sudo mysql -u root -p -e "USE employees; SELECT * FROM massive_table;" > output.txt &
watch -n 5 'mysql -e "SHOW PROCESSLIST\G" | grep -A 5 "SELECT"'

This runs your query in the background while displaying the process list every 5 seconds, so you can see it’s still working and how much progress it’s made. Look for the SQL query itself in the Command or Info column.

Filtering and Processing Results with Linux Tools

Once you have query results in a text file, you can leverage standard Linux command line tools to process them further, enhancing your data extraction capabilities. Here are some useful patterns:

  • Count the number of result rows (excluding the header):
    tail -n +2 queryresults.txt | wc -l
  • Extract specific columns using awk (e.g., print the second column, then the first):
    awk '{print $2, $1}' queryresults.txt
  • Search for specific patterns in results:
    grep -i "engineering" dept_distribution.txt
  • Sort results by a numeric column (e.g., the third column):
    tail -n +2 queryresults.txt | sort -k3 -n
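These building blocks chain together naturally. As a sketch, the pipeline below finds the department with the most employees from a file like dept_distribution.txt (the printf stands in for the file contents, using the sample data from earlier):

```shell
printf 'dept_name\tcount\nDevelopment\t4\nProduction\t3\nHuman Resources\t1\nQuality Management\t2\n' |
tail -n +2 |                          # drop the header row
sort -t "$(printf '\t')" -k2,2 -rn |  # sort by the count column, descending
head -n 1 |                           # keep the top department
cut -f1                               # print just its name
# Output: Development
```

Note the explicit tab delimiter for sort and cut; without it, sort would split multi-word department names like "Quality Management" into separate fields.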

Handling Special Characters and Large Datasets

When your data contains special characters, tabs, or newlines, the default output can get messy. Use the --batch and --raw options for cleaner, machine-readable output:

sudo mysql -u root -p --batch --raw -e "SELECT description FROM products WHERE category='electronics';" > products.txt

For queries that return millions of rows, you might run into memory issues. Instead of loading everything into memory, stream the results directly:

sudo mysql -u root -p --quick -e "SELECT * FROM huge_table;" | gzip > huge_results.txt.gz

The --quick option tells MySQL to retrieve rows one at a time instead of buffering the entire result set, and piping through gzip compresses the output on the fly, saving disk space.
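You can then inspect the compressed results without ever decompressing them to disk by streaming through gunzip -c (or zcat). A self-contained sketch with stand-in data:

```shell
# Create a small compressed "result set" to work with:
printf 'emp_no\tsalary\n10001\t66074\n10005\t82507\n' | gzip > sample_results.txt.gz

gunzip -c sample_results.txt.gz | head -n 2            # peek at the first rows
gunzip -c sample_results.txt.gz | tail -n +2 | wc -l   # count data rows (prints 2)

rm sample_results.txt.gz
```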

Creating Quick Database Backups

While not technically running queries, you can use similar command-line techniques for quick database dumps with the mysqldump command, a fundamental tool for MySQL/MariaDB administration.

sudo mysqldump -u root -p employees | gzip > employees_backup_$(date +%Y%m%d).sql.gz

Or backup just specific tables:

sudo mysqldump -u root -p employees employees salaries | gzip > critical_tables_$(date +%Y%m%d).sql.gz
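Restoring is the same pipeline in reverse: stream the compressed dump back into mysql. The restore command below is a sketch (file and database names are illustrative); the runnable part demonstrates gzip -t, which verifies an archive is intact before you depend on it:

```shell
# Restore a compressed dump (sketch -- adjust names to your backup):
#   gunzip -c employees_backup_20250101.sql.gz | sudo mysql -u root -p employees

# Always test backup archives for corruption; gzip -t exits non-zero on damage:
printf 'CREATE TABLE demo (id INT);\n' | gzip > demo_backup.sql.gz
gzip -t demo_backup.sql.gz && echo "archive OK"
rm demo_backup.sql.gz
```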

Scheduling Automated Reports with Cron or systemd timers

Combine everything we’ve covered to create automated daily reports using cron, or for modern Linux systems, consider systemd timers for greater robustness and logging capabilities. This is a prime example of effective database automation.

#!/bin/bash
REPORT_DATE=$(date +%Y-%m-%d)
REPORT_FILE="/var/reports/daily_stats_${REPORT_DATE}.txt"

{
    echo "Database Statistics Report - ${REPORT_DATE}"
    echo "=========================================="
    echo

    echo "Total Employees:"
    mysql -e "USE employees; SELECT COUNT(*) FROM employees;"
    echo

    echo "New Hires This Month:"
    mysql -e "USE employees; SELECT COUNT(*) FROM employees WHERE MONTH(hire_date) = MONTH(CURRENT_DATE()) AND YEAR(hire_date) = YEAR(CURRENT_DATE());"
    echo

    echo "Department Distribution:"
    mysql -e "USE employees; SELECT d.dept_name, COUNT(*) as count FROM dept_emp de JOIN departments d ON de.dept_no = d.dept_no WHERE de.to_date='9999-01-01' GROUP BY d.dept_name ORDER BY count DESC;"

} > "$REPORT_FILE"

echo "Report generated: $REPORT_FILE"

Add it to cron to run daily at 6 AM (make sure the script path is correct, the script is executable, and the /var/reports directory exists):

0 6 * * * /usr/local/bin/generate_db_report.sh

Unique Tip: Consider systemd timers for enhanced scheduling. While cron is venerable, systemd timers offer benefits like better logging, dependency management, and event-driven execution on modern Linux distributions. You can create a .service file for your script and a .timer file to schedule it, providing more control and easier debugging than traditional cron jobs.
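As a sketch, the cron entry above translates into a service/timer pair like this (unit names and paths are illustrative):

```ini
# /etc/systemd/system/db-report.service
[Unit]
Description=Generate daily database statistics report

[Service]
Type=oneshot
ExecStart=/usr/local/bin/generate_db_report.sh
```

```ini
# /etc/systemd/system/db-report.timer
[Unit]
Description=Run the database report daily at 6 AM

[Timer]
OnCalendar=*-*-* 06:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with sudo systemctl enable --now db-report.timer, check the schedule with systemctl list-timers, and read the script's output with journalctl -u db-report.service. Persistent=true also means a run missed while the machine was off is made up at the next boot, something cron cannot do on its own.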

Summary

We've explored a wealth of Linux command line tips that system administrators and developers alike can find invaluable for automating daily tasks and performing complex MySQL/MariaDB administration with ease. The key takeaway here is that you don’t always need to fire up the full MySQL/MariaDB shell or rely on heavy GUI tools to work with your databases. The command line provides speed, powerful database automation capabilities, and the ability to seamlessly integrate database operations into your existing shell scripting workflows and custom tools for advanced data extraction.

Do you have any other tips that you would like to share with the rest of the community? If so, please do so using the comment form below.

FAQ

Question 1: Why should I use the Linux command line for MySQL/MariaDB queries instead of a GUI tool?

Answer 1: Using the Linux command line offers several advantages for MySQL/MariaDB administration:

  1. Automation: Easily integrate queries into shell scripts, cron jobs, or systemd timers for automated tasks like reporting, backups, or data processing.
  2. Efficiency: Quickly execute single queries or commands without the overhead of launching a full graphical application.
  3. Remote Access: Perform database operations efficiently over SSH on headless servers where a GUI is unavailable.
  4. Scriptability: Chain commands with other Linux utilities (grep, awk, sed, tee) for advanced data extraction and manipulation.

Question 2: What are the security implications of storing passwords in ~/.my.cnf, and how can I mitigate them?

Answer 2: Storing passwords in ~/.my.cnf in plain text, while convenient for database automation, introduces a security risk if the file is not properly secured. If an unauthorized user gains access to your system, they could potentially read your database credentials.
To mitigate this, ensure the file permissions are set strictly: chmod 600 ~/.my.cnf. This restricts read/write access to only the file owner. For higher security in production environments, consider using alternative methods like MySQL's mysql_config_editor to store encrypted credentials, environment variables, or more robust authentication mechanisms like PAM or Kerberos, especially when dealing with sensitive MySQL/MariaDB administration tasks.

Question 3: How can I handle very large query results without running out of memory on the command line?

Answer 3: When performing data extraction from very large tables, you can prevent memory issues by using MySQL's --quick option (or -q). This tells the client to retrieve rows one at a time, instead of buffering the entire result set in memory before printing. You can combine this with piping the output directly to a file or compression utilities like gzip to save disk space and manage output effectively. For example: mysql -u root -p --quick -e "SELECT * FROM huge_table;" | gzip > huge_results.txt.gz. This ensures efficient handling of massive datasets even on systems with limited memory.


