Managing database storage effectively is critical to keeping applications performant and scalable. This tutorial will walk through a Python script that dynamically increases the storage capacity of an Azure SQL Database by a specified percentage. We will use the Azure SDK for Python to interact with Azure SQL, applying best practices to handle authentication, logging, and error handling.
Before running the script, install the required Azure SDK packages with pip:

pip install azure-identity azure-mgmt-sql
The script performs the following steps:

- Authenticates with Azure using DefaultAzureCredential, which supports various authentication methods (environment variables, managed identities, Azure CLI, and more).
- Retrieves the current properties of the target database, including its maximum size in bytes.
- Calculates the new maximum size by applying the specified percentage increase, rounding up to the nearest GB.
- Applies the new size to the database.

Here’s the complete Python script to accomplish this:
import math
import logging

from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.core.exceptions import AzureError

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Input variables
subscription_id = "{ YOUR SUBSCRIPTION }"
resource_group_name = "{ RESOURCE GROUP NAME }"
server_name = "{ SQL SERVER NAME }"
database_name = "{ DATABASE NAME }"

# Percentage increase for the database size
percentage_increase = 20

def main():
    try:
        # Authenticate with Azure
        logger.info("Authenticating with Azure...")
        credential = DefaultAzureCredential()
        sql_client = SqlManagementClient(credential, subscription_id)

        # Get current database details
        logger.info(f"Fetching current properties for database: {database_name}...")
        database = sql_client.databases.get(
            resource_group_name=resource_group_name,
            server_name=server_name,
            database_name=database_name,
        )

        # Calculate the new size in GB
        current_max_size_gb = database.max_size_bytes / (1024 ** 3)
        logger.info(f"Current max size: {current_max_size_gb} GB")

        new_max_size_gb = math.ceil(current_max_size_gb * (1 + percentage_increase / 100))
        new_max_size_bytes = new_max_size_gb * (1024 ** 3)
        logger.info(f"New max size (rounded): {new_max_size_gb} GB")

        update_params = {"max_size_bytes": int(new_max_size_bytes)}

        # Update database size (begin_update returns a poller; .result() waits for completion)
        logger.info("Updating database size...")
        updated_db = sql_client.databases.begin_update(
            resource_group_name=resource_group_name,
            server_name=server_name,
            database_name=database_name,
            parameters=update_params,
        ).result()

        logger.info(f"Updated max size to: {updated_db.max_size_bytes / (1024 ** 3)} GB")
    except AzureError as e:
        logger.error(f"An error occurred while interacting with Azure: {e}")
    except Exception as e:
        logger.error(f"An unexpected error occurred: {e}")

if __name__ == "__main__":
    main()
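The size calculation at the heart of the script can be isolated into a small helper, which makes the rounding behavior easy to unit test. This is just a sketch: the helper name `compute_new_max_size_bytes` is not part of the script above, only an illustration of the same arithmetic.

```python
import math

GB = 1024 ** 3  # bytes per gigabyte

def compute_new_max_size_bytes(current_max_size_bytes: int, percentage_increase: float) -> int:
    """Apply a percentage increase and round up to the nearest whole GB."""
    current_gb = current_max_size_bytes / GB
    new_gb = math.ceil(current_gb * (1 + percentage_increase / 100))
    return new_gb * GB

# A 250 GB database grown by 20% becomes 300 GB
print(compute_new_max_size_bytes(250 * GB, 20) // GB)  # → 300
```

Rounding up matters: a 5 GB database grown by 10% yields 5.5 GB, which this helper rounds to 6 GB rather than allocating less than the requested increase.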
A few implementation notes:

- The script uses the logging library to log each significant step.
- DefaultAzureCredential provides seamless authentication, including support for environment variables and managed identities.
- The sql_client.databases.get() call retrieves the database’s current maximum size, in bytes.
- The math.ceil function ensures we round up to the nearest GB to avoid insufficient allocation.
- The begin_update() method applies the updated size to the database.

To run the script, use the following command:
python update_database_size.py
This script is a useful tool for automatically scaling your database storage, ensuring you can handle increased data volumes without manual intervention. This approach is especially beneficial in scenarios where usage patterns are highly dynamic, allowing for seamless storage expansion with minimal downtime.
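To make that automation easier to reuse across environments, the hard-coded inputs could be read from the command line instead. Here is a minimal sketch using argparse; the argument names are illustrative, not part of the original script:

```python
import argparse

def parse_args(argv=None):
    """Parse connection details and the growth percentage from the CLI."""
    parser = argparse.ArgumentParser(description="Grow an Azure SQL Database's max size.")
    parser.add_argument("--subscription-id", required=True)
    parser.add_argument("--resource-group", required=True)
    parser.add_argument("--server", required=True)
    parser.add_argument("--database", required=True)
    parser.add_argument("--percentage-increase", type=float, default=20.0)
    return parser.parse_args(argv)

# Example invocation with placeholder values
args = parse_args([
    "--subscription-id", "00000000-0000-0000-0000-000000000000",
    "--resource-group", "my-rg",
    "--server", "my-server",
    "--database", "my-db",
])
print(args.database, args.percentage_increase)  # → my-db 20.0
```

The same script can then be scheduled (for example, from an Azure Automation runbook or a cron job) against any database without editing the source.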
Tip: adjust the logging level (INFO, WARNING, ERROR) based on the deployment environment. This solution provides a practical way to manage SQL database storage, ensuring that your applications continue running smoothly as data grows.