Hi All
The PowerShell script below gives you an idea of how to import an Azure database backup (a bacpac file) into your Azure subscription.
The comments are inline with the actions performed.
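The script uses the classic Azure (service management) PowerShell cmdlets, so the Azure module needs to be loaded first. A minimal prerequisite sketch, assuming the module is installed via the Azure PowerShell installer:
# Load the classic Azure service management module (assumes Azure PowerShell is installed)
Import-Module Azure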
# The name of the database the backup will be imported into
$DatabaseName = "MyDatabaseName"
Write-Host "You have to download the publish settings file inorder to continue with the deployment"
Write-Host "You will be redirected to the azure site to download the file. the download will start automatically"
$name = Read-Host 'Press Enter to Continue'
# Downloads the publish settings file for your Azure subscription
Get-AzurePublishSettingsFile
$name = Read-Host 'Press Enter to Continue'
Write-Host "Browse the .publishsetttings file you have just downloaded"
$name = Read-Host 'Press Enter to Continue'
# Use a file dialog to browse for the publish settings file, starting in the user's Downloads folder
Add-Type -AssemblyName System.Windows.Forms
$dialog = New-Object System.Windows.Forms.OpenFileDialog
$dialog.InitialDirectory = $home + "\Downloads"
$dialog.DefaultExt = '.publishsettings'
$dialog.Filter = 'Publish Settings|*.publishsettings|All Files|*.*'
$dialog.Multiselect = $false
$dialog.Title = "Select an Azure subscription .publishsettings file"
$rc = $dialog.ShowDialog()
if ($rc -ne [System.Windows.Forms.DialogResult]::OK)
{
    Write-Host "No file was selected. Exiting."
    exit
}
$publishSettingsFile = $dialog.FileName
# Imports the publish settings file.
# The publish settings file contains the management certificate that lets you access the Azure subscription.
Import-AzurePublishSettingsFile $publishSettingsFile
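# Optional check (not in the original script): confirm the subscription was imported correctly
Get-AzureSubscription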
# The database backup is uploaded to a blob container in a storage account and then imported into the database.
# The backup file/container will be removed after the import.
$StorageAccountName = Read-Host 'Enter the Storage Account Name'
# Fetch the primary key from the storage account; it is needed to create the storage context
$storageAccountKey = Get-AzureStorageKey $StorageAccountName | %{ $_.Primary }
# Create a storage context
$StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storageAccountKey
# Set the name of the container that will be created in your storage account to hold the backup file
$DbBackupContainerName = "databasebackupcontainer"
# The database backup file name. This is the bacpac file generated when exporting the database.
$DbBackupFileName = "MyDatabaseBackup.bacpac"
# Get the current folder from the script execution path
$scriptpath = $MyInvocation.MyCommand.Path
$dir = Split-Path $scriptpath
# Append the "Data" subfolder to the script path. The database backup file
# is expected to be in a Data subfolder next to the script.
$fqName = $dir + "\Data\" + $DbBackupFileName
# Create the container that the backup will be uploaded to
New-AzureStorageContainer $DbBackupContainerName -Permission Off -Context $StorageContext
# Upload the backup file to the blob container
Set-AzureStorageBlobContent -Blob $DbBackupFileName -Container $DbBackupContainerName -File $fqName -Context $StorageContext -Force
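# Optional check (not in the original script): list the uploaded blob to confirm the upload succeeded
Get-AzureStorageBlob -Container $DbBackupContainerName -Context $StorageContext | Select-Object Name, Length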
Write-Host "Enter the Database Server credentials in the popup window"
$name = Read-Host 'Press Enter to Continue'
# Prompt the user to enter credentials for the database server.
# The password must follow the standard password guidelines:
# a mix of letters, numbers, and special characters. The script does not handle
# the error raised when the password does not meet those requirements.
$credential = Get-Credential
# Create a new server to host the database. If you have an existing server, skip this step and use its name directly in the next step.
$server = New-AzureSqlDatabaseServer -Location "East US" -AdministratorLogin $credential.UserName -AdministratorLoginPassword $credential.GetNetworkCredential().Password
# Build the fully qualified server name
$fqServerName = $server.ServerName + ".database.windows.net"
# To manage the database, the current IP address must be added to the server's list of allowed IP addresses.
# Use the .NET WebClient object to get the public IP address and add it as a firewall rule.
$wc = New-Object net.webclient
$ip = $wc.downloadstring("https://checkip.dyndns.com") -replace "[^\d\.]"
# Get today's date and build a firewall rule name with a timestamp so the name is unique
$todaysdatetime = Get-Date
$firewallRulename = "ClientIpAddress" + $todaysdatetime.ToString("yyyy-M-d-H-m-s-fff")
# Add the firewall rule for the client IP address
$server | New-AzureSqlDatabaseServerFirewallRule -RuleName $firewallRulename -StartIPAddress $ip -EndIPAddress $ip
# Create a new database with the Web edition. You can also create the database with the Business edition.
$database = New-AzureSqlDatabase -ServerName $server.ServerName -DatabaseName $DatabaseName -Edition "Web" -MaxSizeGB 1 -Collation "SQL_Latin1_General_CP1_CI_AS"
# Allow Azure services to access the server
$server | New-AzureSqlDatabaseServerFirewallRule -AllowAllAzureServices
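# Optional check (not in the original script): list the firewall rules to confirm both rules were created
Get-AzureSqlDatabaseServerFirewallRule -ServerName $server.ServerName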
# Set up the SQL database server context
$SqlCtx = New-AzureSqlDatabaseServerContext -ServerName $server.ServerName -Credential $credential
# Get the container into a variable
$Container = Get-AzureStorageContainer -Name $DbBackupContainerName -Context $StorageContext
# Import the uploaded bacpac file into the database named above
$importRequest = Start-AzureSqlDatabaseImport -SqlConnectionContext $SqlCtx -StorageContainer $Container -DatabaseName $DatabaseName -BlobName $DbBackupFileName
# Delete the container with the database backup bacpac file after the import.
# Note: Start-AzureSqlDatabaseImport runs asynchronously; see the comments below for how to
# wait until the import has completed before removing the container.
Remove-AzureStorageContainer -Name $DbBackupContainerName -Context $StorageContext -Force
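To run the script, save it as a .ps1 file with the backup file in a Data subfolder next to it; a hypothetical layout:
C:\Scripts\ImportAzureDatabase.ps1
C:\Scripts\Data\MyDatabaseBackup.bacpac
Then run it from a PowerShell prompt:
PS C:\Scripts> .\ImportAzureDatabase.ps1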
Comments
Anonymous
February 09, 2015
Is there any way to know when Start-AzureSqlDatabaseImport has completed? The code you list would remove the bacpac from storage before the import completed and therefore fail.
Anonymous
June 10, 2015
The Start-AzureSqlDatabaseImport cmdlet emits an ImportExportRequest object that can be piped into the Get-AzureSqlDatabaseImportExportStatus cmdlet. Repeatedly testing the Status property of the StatusInfo object you get from Get-AzureSqlDatabaseImportExportStatus (in a do-while loop with a suitable sleep, for example) lets you wait until the import process is actually finished before doing the storage container clean up work.
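A minimal sketch of that polling approach, assuming the classic Azure module cmdlets behave as described above (the exact status strings and the 10-second sleep interval are assumptions):
# Start the import and poll its status until it finishes, then clean up the container
$importRequest = Start-AzureSqlDatabaseImport -SqlConnectionContext $SqlCtx -StorageContainer $Container -DatabaseName $DatabaseName -BlobName $DbBackupFileName
do
{
    Start-Sleep -Seconds 10
    # Pipe the ImportExportRequest object into the status cmdlet and inspect the Status property
    $status = $importRequest | Get-AzureSqlDatabaseImportExportStatus
    Write-Host ("Import status: " + $status.Status)
} while ($status.Status -ne "Completed" -and $status.Status -notlike "Failed*")
# Only remove the backup container once the import has actually finished
Remove-AzureStorageContainer -Name $DbBackupContainerName -Context $StorageContext -Force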