Friday, January 21, 2022

Glimpse of MySQL 8.0.28 Release

MySQL 8.0.28 is GA and ready to use in production!

Below is a high-level overview; more detail is available in the official release notes.

[1] Disabling Audit Logging for all connecting and connected sessions

SET GLOBAL audit_log_disable = true;

Setting audit_log_disable to true disables the audit log plugin. The plugin is re-enabled when audit_log_disable is set back to false, which is the default setting.
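A quick sanity check before and after toggling it, assuming the audit_log plugin is installed:

-- Check whether audit logging is currently disabled
SHOW GLOBAL VARIABLES LIKE 'audit_log_disable';

-- Re-enable auditing for all sessions (false is the default)
SET GLOBAL audit_log_disable = false;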


[2] Performance Schema: a new statement metric, CPU_TIME, is now available, enabling you to measure the CPU time spent on a query.

This is helpful for troubleshooting SQL statement performance, especially the hardware resources a statement consumes; the Performance Schema statement event and digest summary tables expose the new metric, and a sample query is shown below.
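For example, a query along these lines against the digest summary table should surface the most CPU-hungry statements (Performance Schema timer columns are in picoseconds, so the value is scaled to seconds; column names per the 8.0.28 release notes):

-- Top 5 statement digests by accumulated CPU time (MySQL 8.0.28+)
SELECT DIGEST_TEXT,
       ROUND(SUM_CPU_TIME / 1e12, 3) AS cpu_seconds,  -- picoseconds -> seconds
       COUNT_STAR AS executions
FROM performance_schema.events_statements_summary_by_digest
ORDER BY SUM_CPU_TIME DESC
LIMIT 5;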

[3] InnoDB now supports ALTER TABLE ... RENAME COLUMN operations using ALGORITHM=INSTANT.

Operations that support ALGORITHM=INSTANT only modify metadata in the data dictionary. No exclusive metadata locks are taken on the table during preparation and execution phases of the operation, and table data is unaffected, making the operations instantaneous. If not specified explicitly, ALGORITHM=INSTANT is used by default by DDL operations that support it.
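A minimal illustration on a hypothetical table:

-- Instant column rename: metadata-only change, no table rebuild (InnoDB, 8.0.28+)
ALTER TABLE customers RENAME COLUMN phone TO phone_number, ALGORITHM=INSTANT;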

[4] Replication: When the PAD_CHAR_TO_FULL_LENGTH SQL mode was enabled on a replica server, trailing spaces could be added to a replication channel’s name in the replication metadata repository tables, resulting in errors in replication operations that identified the channel using that data. The issue has now been fixed in MySQL 8.0 by using VARCHAR for character columns, and in MySQL 5.7 by disabling the SQL mode when reading from those tables. Thanks to Brian Yue for the contribution. (Bug #33213841)
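For context, PAD_CHAR_TO_FULL_LENGTH (deprecated since 8.0.13) causes CHAR values to be returned padded to their full declared length, which is how the channel names picked up trailing spaces; a small illustration:

-- With the mode enabled, CHAR values come back space-padded on retrieval
CREATE TABLE t (c CHAR(10));
INSERT INTO t VALUES ('ch1');
SET SESSION sql_mode = 'PAD_CHAR_TO_FULL_LENGTH';
SELECT CONCAT('[', c, ']') FROM t;  -- returns [ch1       ]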

[5] Producing a per-table dump using mysqldump in MySQL 5.7 and 8.0 requires a longer execution time compared to MySQL 5.6. This is because the information_schema.files table, which mysqldump queries for information on log file groups, contains information about InnoDB data files as well as NDB data files as of MySQL 5.7.
In MySQL 8.0, the issue has been fixed by rewriting the query to select only the appropriate data files. In MySQL 5.7, Information Schema tables do not have indexes, so a full table scan is still required. (Bug #29210990, Bug #93875)
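For reference, a per-table dump of the kind affected looks like this (host and credentials here are hypothetical):

# Dump a single table; on 5.7 this is where the slow information_schema.files scan bites
mysqldump -h127.0.0.1 -uroot -p sakila film > film.sql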

Tuesday, January 18, 2022

How to Migrate From Azure for MySQL to MDS

Let's walk through how to migrate the sakila database from Azure Database for MySQL into Oracle Cloud MySQL Database Service (MDS).

For the migration, we'll use two MySQL Shell utility features:

1.       Dump Schema utility

a.       Exports the schema from the Azure for MySQL instance into Oracle Cloud Object Storage.

2.       Load Dump utility

a.       Imports the schema from Object Storage into MySQL Database Service (MDS) on Oracle Cloud.

 

How does the migration work?

Suppose you want to lift-and-shift a database called “sakila”. The Dump Schema utility exports the sakila database from the source (in this example, Azure Cloud) to OCI (Oracle Cloud Infrastructure) Object Storage.

Then the Load Dump utility imports it directly into MySQL Database Service running in Oracle Cloud, the managed MySQL service.
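In MySQL Shell terms, the whole migration is just two calls. A minimal sketch with placeholder bucket and namespace values (the full, working options appear later in this post):

// Step 1 (run near the source): dump the schema straight into an OCI Object Storage bucket
util.dumpSchemas(["sakila"], "sakilaDBdump",
    {osBucketName: "<bucket>", osNamespace: "<namespace>", ocimds: true});

// Step 2 (run on an OCI Compute instance): load the dump into MDS from that bucket
util.loadDump("sakilaDBdump",
    {osBucketName: "<bucket>", osNamespace: "<namespace>"});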

The diagram below illustrates the flow.

(Diagram: MySQL Shell dumps the sakila schema from Azure for MySQL to OCI Object Storage, then loads it into MDS.)

What do we need handy?

1.       MySQL Shell 8.0.x.

2.       Azure for MySQL up and running.

3.       MySQL Database Service (MDS) up and running.

4.       OCI CLI installed on the Azure VM.

5.       OCI CLI installed on the Oracle Cloud Compute Instance.

6.       local_infile must be ON on the destination instance; a quick check is shown below.
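A minimal check on the destination, assuming your admin user is allowed to change the variable:

-- On the destination MySQL instance
SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = ON;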

How to launch MySQL Database Service (MDS)?

Another of my blog posts, linked below, will help you create an MDS instance:

https://mysqlsolutionsarchitect.blogspot.com/

 Additional Details

Azure VM details

- Public IP address: 20.198.96.xx
- SSH PPK file
- User: azureuser

Azure for MySQL details

- Database name: Sakila
- Endpoint: azuremysql-dbs.mysql.database.azure.com
- User: admin123
- Port: 3306
- No. of tables: 23

Oracle Cloud Compute Instance details

- Public IP address:
- SSH PPK file
- User: opc

MDS details

- Private IP address: 10.0.0.xx

 

First, install the OCI CLI on the Azure compute instance:

bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

 

exec -l $SHELL

 cd /root/bin/

oci setup config

Follow the prompts.

cat /root/.oci/oci_api_key_public.pem

Paste the public key output into the OCI Console under User Settings -> API Keys.
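Once the key is uploaded, a simple way to verify the CLI can reach your tenancy is to ask for the Object Storage namespace:

# Should print the tenancy's Object Storage namespace (e.g. idazzjlcjqzj)
oci os ns get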


Command to Export the backup from Azure for MySQL to OCI Object Storage

On the Azure compute instance (VM), open MySQL Shell and execute the commands below.

Connect to the Azure for MySQL instance:

 mysqlsh -hazuremysql-dbs.mysql.database.azure.com -uadmin123@azuremysql-dbs -pWelcome1!

[root@chandan-vm oltp_legacy]# mysqlsh -hazuremysql-dbs.mysql.database.azure.com -uadmin123@azuremysql-dbs -pWelcome1!

MySQL Shell 8.0.23

 Copyright (c) 2016, 2021, Oracle and/or its affiliates.

Oracle is a registered trademark of Oracle Corporation and/or its affiliates.

Other names may be trademarks of their respective owners.

 Type '\help' or '\?' for help; '\quit' to exit.

WARNING: Using a password on the command line interface can be insecure.

Creating a session to 'admin123%40azuremysql-dbs@azuremysql-dbs.mysql.database.azure.com'

Fetching schema names for autocompletion... Press ^C to stop.

Your MySQL connection id is 64630

Server version: 8.0.15 Source distribution

No default schema selected; type \use <schema> to set one.

 MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS >


Check for compatibility issues and gather problems in advance of the migration by using a dry run.

 

MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS > util.dumpSchemas(["sakila"], "sakilaDBdump", {dryRun: true, ocimds: true})

Acquiring global read lock

Global read lock acquired

Gathering information - done

All transactions have been started

Locking instance for backup

NOTE: Backup lock is not supported in MySQL 5.7 and DDL changes will not be blocked. The dump may fail with an error or not be completely consistent if schema changes are made while dumping.

Global read lock has been released

Checking for compatibility with MySQL Database Service 8.0.23

NOTE: MySQL Server 5.7 detected, please consider upgrading to 8.0 first. You can check for potential upgrade issues using util.checkForServerUpgrade().

ERROR: Procedure sakila.film_not_in_stock - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: Procedure sakila.rewards_report - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: Procedure sakila.film_in_stock - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.nicer_but_slower_film_list - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.sales_by_film_category - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.actor_info - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.film_list - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.sales_by_store - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.staff_list - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

ERROR: View sakila.customer_list - definition uses DEFINER clause set to user `admin123`@`%` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges (fix this with 'strip_definers' compatibility option)

Compatibility issues with MySQL Database Service 8.0.23 were found. Please use the 'compatibility' option to apply compatibility adaptations to the dumped DDL.

Util.dumpSchemas: Compatibility issues were found (RuntimeError)


Fix:

MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS > util.dumpSchemas(["sakila"], "sakilaDBdump", {dryRun: true, ocimds: true, "compatibility": ["strip_definers", "strip_restricted_grants"]})

Acquiring global read lock

Global read lock acquired

Gathering information - done

All transactions have been started

Locking instance for backup

NOTE: Backup lock is not supported in MySQL 5.7 and DDL changes will not be blocked. The dump may fail with an error or not be completely consistent if schema changes are made while dumping.

Global read lock has been released

Checking for compatibility with MySQL Database Service 8.0.23

NOTE: MySQL Server 5.7 detected, please consider upgrading to 8.0 first. You can check for potential upgrade issues using util.checkForServerUpgrade().

Writing DDL for table `sakila`.`inventory`

Writing DDL for table `sakila`.`rental`

Writing DDL for table `sakila`.`category`

Writing DDL for table `sakila`.`address`

Writing DDL for table `sakila`.`city`

 MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS >

MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS > util.dumpSchemas(["sakila"],"sakilaDBdump",{threads:30,"osBucketName": "chandanBucket", "osNamespace": "idazzjlcjqzj",  "ocimds": "true","ociConfigFile":"/root/.oci/config", "compatibility": ["strip_definers", "strip_restricted_grants"]})

Output

 

MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS > util.dumpSchemas(["sakila"],"sakilaDBdump",{threads:30,"osBucketName": "chandanBucket", "osNamespace": "idazzjlcjqzj",  "ocimds": "true","ociConfigFile":"/root/.oci/config", "compatibility": ["strip_definers", "strip_restricted_grants"]})

Acquiring global read lock

Global read lock acquired

Gathering information - done

All transactions have been started

Locking instance for backup

Data dump for table `sakila`.`payment` will be written to 1 file

Data dump for table `sakila`.`film_text` will be written to 1 file

Data dump for table `sakila`.`film_actor` will be written to 1 file

Data dump for table `sakila`.`rental` will be written to 1 file

Data dump for table `sakila`.`inventory` will be written to 1 file

Data dump for table `sakila`.`category` will be written to 1 file

Data dump for table `sakila`.`address` will be written to 1 file

Data dump for table `sakila`.`city` will be written to 1 file

1 thds dumping - 100% (46.27K rows / ~46.27K rows), 1.61K rows/s, 96.26 KB/s uncompressed, 0.00 B/s compressed

Duration: 00:00:01s

Schemas dumped: 1

Tables dumped: 16

Uncompressed data size: 2.92 MB

Compressed data size: 557.10 KB

Compression ratio: 5.2

Rows written: 46273

Bytes written: 557.10 KB

Average uncompressed throughput: 1.87 MB/s

Average compressed throughput: 356.39 KB/s

 MySQL  azuremysql-dbs.mysql.database.azure.com:3306 ssl  JS >

 

You can cross-check whether the dump has landed in Oracle Cloud Object Storage, either in the console or with the CLI as shown below.
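Besides the console, the OCI CLI can list the uploaded dump files; something like this should work with the bucket used above:

# List the dump objects written by util.dumpSchemas
oci os object list --bucket-name chandanBucket --prefix sakilaDBdump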

 

Import Dump file into MySQL Database Service (MDS), from OCI Object Storage

 

First, install the OCI CLI by following the guide below:

https://docs.oracle.com/en-us/iaas/Content/API/SDKDocs/cliinstall.htm

 For import into a MySQL DB Service (MDS), the MySQL Shell instance where you run the dump loading utility must be installed on an Oracle Cloud Infrastructure Compute instance that has access to the MySQL DB System.

Let's connect to the Oracle Cloud Compute Instance and check whether we can reach the MDS instance.

To connect to the Compute Instance, open PuTTY and enter its public IP; see the screenshot below.

(Screenshot: PuTTY session configuration with the Compute Instance public IP and the opc user.)

How to deploy an MDS instance? Another of my blog posts, linked below, walks you through it:

https://mysqlsolutionsarchitect.blogspot.com/2020/09/how-to-launch-mysql-database-service.html

Let's connect to the MDS instance:

sudo su root

mysql -h10.0.1.3 -uadmin -pXXXXX

 

mysql> show databases;

+--------------------+

| Database           |

+--------------------+

| information_schema |

| mysql              |

| performance_schema |

| sys                |

+--------------------+

4 rows in set (0.00 sec)

 

Connect MySQL Shell to perform the import:

# mysqlsh -h10.0.1.3 -uadmin -pWelcome1!


Dry-run script:

MySQL  10.0.1.3:33060+ ssl  JS > util.loadDump("sakilaDBdump",{dryRun: true, osBucketName: "chandanBucket", osNamespace: "idazzjlcjqzj","ociConfigFile":"/root/.oci/config"})

Output

 MySQL  10.0.1.3:33060+ ssl  JS > util.loadDump("sakilaDBdump",{dryRun: true, osBucketName: "chandanBucket", osNamespace: "idazzjlcjqzj","ociConfigFile":"/root/.oci/config"})

Loading DDL and Data from OCI ObjectStorage bucket=chandanBucket, prefix='sakilaDBdump' using 4 threads.

Opening dump...

dryRun enabled, no changes will be made.

Target is MySQL 8.0.23-u2-cloud. Dump was produced from MySQL 8.0.15

Fetching dump data from remote location...

Fetching 16 table metadata files for schema `sakila`...

Checking for pre-existing objects...

Executing common preamble SQL

Executing DDL script for schema `sakila`

[Worker001] Executing DDL script for `sakila`.`city`

[Worker003] Executing DDL script for `sakila`.`category`

[Worker002] Executing DDL script for `sakila`.`sales_by_store` (placeholder for view)

[Worker003] Executing DDL script for `sakila`.`staff_list` (placeholder for view)

[Worker000] Executing DDL script for `sakila`.`actor_info` (placeholder for view)

Executing common postamble SQL

 No data loaded.

0 warnings were reported during the load.

 MySQL  10.0.1.3:33060+ ssl  JS >

 

Script to import the dump from Object Storage into the MySQL Database System:


util.loadDump("sakilaDBdump",{osBucketName: "chandanBucket", osNamespace: "idazzjlcjqzj","ociConfigFile":"/root/.oci/config"})

 

Output

 

MySQL  10.0.1.3:33060+ ssl  JS > util.loadDump("sakilaDBdump",{osBucketName: "chandanBucket", osNamespace: "idazzjlcjqzj","ociConfigFile":"/root/.oci/config"})

Loading DDL and Data from OCI ObjectStorage bucket=chandanBucket, prefix='sakilaDBdump' using 4 threads.

Opening dump...

Target is MySQL 8.0.23-u2-cloud. Dump was produced from MySQL 8.0.15

Fetching dump data from remote location...

Fetching 16 table metadata files for schema `sakila`...

[Worker000] sakila@payment@@0.tsv.zst: Records: 16049  Deleted: 0  Skipped: 0  Warnings: 0

Executing common postamble SQL

 15 chunks (46.27K rows, 2.92 MB) for 15 tables in 1 schemas were loaded in 2 sec (avg throughput 1.46 MB/s)

0 warnings were reported during the load.

 MySQL  10.0.1.3:33060+ ssl  JS >

Finally, the import has completed successfully.

Let's verify the results:

MySQL  10.0.1.3:33060+ ssl  JS > \sql

Switching to SQL mode... Commands end with ;

 MySQL  10.0.1.3:33060+ ssl  SQL > show databases;

+--------------------+

| Database           |

+--------------------+

| information_schema |

| mysql              |

| performance_schema |

| sakila             |

| sys                |

+--------------------+

5 rows in set (0.0020 sec)

 MySQL  10.0.1.3:33060+ ssl  SQL > use sakila;

Default schema set to `sakila`.

Fetching table and column names from `sakila` for auto-completion... Press ^C to stop.

 MySQL  10.0.1.3:33060+ ssl  sakila  SQL > select count(*) from city;

+----------+

| count(*) |

+----------+

|      600 |

+----------+

1 row in set (0.0025 sec)

 MySQL  10.0.1.3:33060+ ssl  sakila  SQL > show tables;

+----------------------------+

| Tables_in_sakila           |

+----------------------------+

| actor                      |

| actor_info                 |

| address                    |

| category                   |

| city                       |

| country                    |

| customer                   |

| customer_list              |

| film                       |

| film_actor                 |

| film_category              |

| film_list                  |

| film_text                  |

| inventory                  |

| language                   |

| nicer_but_slower_film_list |

| payment                    |

| rental                     |

| sales_by_film_category     |

| sales_by_store             |

| staff                      |

| staff_list                 |

| store                      |

+----------------------------+

23 rows in set (0.0019 sec)

 MySQL  10.0.1.3:33060+ ssl  sakila  SQL >

Conclusion:

The migration completed successfully!

MySQL Shell makes MySQL easier to use by providing an interactive client that supports SQL, Document Store, JavaScript, and Python interfaces, with support for writing custom extensions.

MySQL Database Service is a fully managed database service that enables organizations to deploy cloud-native applications using the world's most popular open source database. It is 100% developed, managed and supported by the MySQL Team.