(lively music) Hello and welcome. Please enjoy this short readiness video on Azure Database for PostgreSQL. With us now, we have Gary Hope. Hi, Gary, how are you doing today? Hi Sanjay, good to be here. Of course, it's great to talk to you, every time I talk to you it's all good. So can you please tell us, what are you gonna talk about today? So today I'm gonna talk about migrating databases to Azure Database for PostgreSQL. And before we get started, what's your role, what do you do? So I'm the technical product manager for Azure Database for PostgreSQL. Yeah. Alright, that's exciting. So let's get started. So, when we think about Azure Database for PostgreSQL, we are primarily moving workloads that are living in customers' data centers. That's the primary place the work comes from. And as Postgres has become increasingly popular for enterprise workloads, making those migrations super smooth and easy for our customers has been very important for us. Of course. So let's get started. So why would they migrate? I mean, Omar talked about all the fun things, but what are the benefits? So I think Omar talked about a lot of those things, but we'll quickly recap: customers really want to take advantage of the platform-as-a-service capability that we've got and the embedded intelligence we're building into that platform, while continuing to leverage this Postgres database that they love, leveraging the extensions that they use, backed by the open source community that supports them. I see, okay. And then obviously, we've got the extensions that enable us to do Hyperscale (Citus), which Omar talked about quite a bit. So you know, migrations sound interesting, but when customers actually start doing a migration, they can be pretty hard to do. So we've tried to make that as simple as possible. What we've got is an Azure database migration guide. And this is not just about the scenario we're talking about here today.
But all migration scenarios, whether you want to migrate your SQL databases up into Azure, or your Postgres databases that we're talking about today, or MySQL, or whatever that is. So a common set of tools for all migrations like that, but also detailed instructions on how to do it. So it's a step-by-step guide. If you go to the Azure database migration guide, you will see a step-by-step guide in terms of how to do those migrations. And we're gonna work through one of those scenarios today for Postgres. You're gonna show a live demo? We'll show a live demo, we'll get to that. So let's talk a little bit about what that migration actually looks like. So you have a database that's running in your data center. And we mustn't forget these databases are not just databases, they have applications attached to them. Yes, of course. So you're gonna wanna migrate your application, and then you're gonna wanna migrate your database up to Azure to take advantage of this scenario. That's right. We have a new Azure Migrate portal experience, where we enable all of those. So that data migration scenario that I spoke about earlier, it's just one piece of your migration. You maybe wanna be able to do all of that, including your applications as well, and we enable all of that through the Azure Migrate portal. But let's dive into the details of the actual Postgres migration. There are two options, right? Yep, we have two options. We have an option that is gonna enable you to pick up your database once, do a dump of that database into a standardized dump file format, which in essence is just a script of your database. And you can pick that up and run it in Azure Database for PostgreSQL. So that then creates a copy of your database. But while you're doing that dump, and while you're doing that load process, and obviously if we're going from on-premises to the cloud, while you're transferring that data, your database is down until you get the new one up and running.
So that's the one option. It is useful for some scenarios; it's useful for testing in a lot of cases, and for super small databases it can be a very quick and easy way to do this. But we also have Azure Database Migration Service, which allows us to do an online migration. It also enables offline migrations, but today we'll be talking about that online migration, where you want your database to be running all the time while you're doing that migration in the back end, and then having a short amount of downtime to be able to cut over from one database to the other. Oh, this is like the example of a retail store, customers are always there, online or in the physical store. Exactly. If you're gonna say hold on for two days, we're taking it down while we migrate, it's not gonna be a great experience. So the demo I'm gonna show you today is really the step-by-step process. If we have an application that's connected to an on-premises database, what we're going to want to do is migrate the data as part of an initial load. Now there's a first step to that process of that initial migration. And that is that we need to set this environment up with the prerequisites to be able to take that data in. And the way that we do that is with a schema migration. In the same way we would do a dump and a load, for the schema migration we just do a dump of the schema and a load of that schema. And when we're migrating from Postgres to Postgres, that process is super simple. You just dump the schema, we get a copy of that schema, we reload it into the environment. And I'll show you some scripts on how to do that as part of the demos. So the schema, or the data as well? Just the schema. So your data is staying in your on-premises database; we've now created a copy of the schema. So basically, this database that's in Azure now looks the same, without the data. And then we use an initial load as part of the Database Migration Service. So we establish a pipeline, we load that data.
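The first, offline option Gary describes can be sketched with the standard Postgres client tools. This is a minimal sketch, not the demo's actual scripts; the host, user, and database names are placeholders.

```shell
# Offline migration sketch: dump the whole on-premises database and
# restore it into the Azure target. Writers should stay off the source
# from the start of the dump until the restore completes.
# All host/user/database names here are illustrative placeholders.

# 1. Dump the source database in the custom archive format:
pg_dump -Fc --host=onprem-host --username=postgres \
        --dbname=sampledb --file=sampledb.dump

# 2. Restore the archive into the Azure Database for PostgreSQL server
#    (single-server logins take the user@servername form):
pg_restore --host=mydemoserver.postgres.database.azure.com \
           --username=adminuser@mydemoserver --dbname=sampledb \
           --no-owner sampledb.dump
```

The `-Fc` custom format lets pg_restore parallelize and filter the restore; a plain-text dump piped straight into psql works for small databases too.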
And then after that we enable new transactions to continuously flow into the database. So in your retail example that you picked up now– New customers in the store now. Yeah, new customers in the store. Those transactions continue on the old database, but the data that they're creating is also written into the new database that you're gonna switch over to. At some point, we need to do the cutover. So we do need to stop data coming into the database in the store, in your on-premises deployment. When you stop that data, we need to redirect the client to the new server once the last transaction has caught up. So this process that is writing the data in the background needs to catch up, and then we will cut over the actual connection to the new database. That's the process, end to end, of migrating an application and its connections to that new database. Without losing those few transactions that came in along the way– You lose no transactions. The only thing is you have a short amount of downtime while you reconfigure your connection strings for the application to point it at the new database. Awesome. Let's see it. So let's jump into this demo, and let me show you what we've got going here. I have a VM set up with, here you go, a local Postgres instance on the platform. And what tool are we showing, Azure Data Studio? So this is Azure Data Studio; it's got the Postgres extension installed in it. So this is native Postgres support in Azure Data Studio. That's awesome. So here I have a localhost server running on this machine. It has a sample database. In here we have some tables. There's a scheduling app, so it's basically a bunch of tasks that are running on here. If I have a look at the tasks that are in here, you can see there's a whole bunch of tasks; if I go and have a look at how many, there's 161 of them. And we wanna migrate that, while we continue to operate on this database, to the new database.
So one of the things we're gonna do is we're gonna simulate data arriving. And I'm gonna run this data load application; it's a simple PowerShell script. And I'll give you a little bit of a view of what's in there. So it's simulating new customers, new products coming in? Correct. We do a clean-up just to get rid of all the excess records that were in there for this test. So there's a bunch of deletes that happen, and then we're inserting row by row. So now if we go back and have a look, you're gonna see that we deleted the whole lot up front, but now, as the new ones come in, we're gonna see more new rows arriving. Right? So what we're gonna want to do now is migrate this application across to our new Azure Database for PostgreSQL. So here, I have the same setup — if I go back to this database here, the same setup — but you can see there are no tables in this one, because I haven't done that initial schema migration. So let's do that initial schema migration. Again, I've got a simple script to do that migration, and I'll run this, and basically it'll export the schema and load the schema into this Azure database. Now, if you have a look, we have a table, but we have no data.
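The PowerShell loader itself isn't shown in full, but the shape of such a row-by-row generator can be sketched in shell. The table and column names below are assumptions, not the actual demo schema; in practice you'd pipe the generated statements into psql.

```shell
# Build a batch of INSERT statements, one per simulated task, then print
# them. To apply them against the source database you would pipe this
# script's output into psql, e.g.:
#   sh generate_tasks.sh | psql "host=localhost dbname=sampledb"
# Table and column names are illustrative assumptions.
SQL=""
for i in $(seq 1 5); do
  SQL="${SQL}INSERT INTO tasks (task_name) VALUES ('simulated-task-$i');
"
done
printf '%s' "$SQL"
```

The real script also deletes leftover test rows up front and sleeps between inserts so the arrival of new data is visible during the migration.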
It's empty. Because as I said, we migrated the schema; the only thing that we moved over was the schema for that database. So basically, on the left-hand side, the server localhost is on-prem, and this one is in the cloud? Correct. And it continues to work. The important thing that we need to remember is that we want that database to continue to work. So let's go and have a look at actually doing the migration. So I go across to my Azure Database Migration Service. What I've done–
How do you get there? Like portal.azure.com? This is the Azure portal; there's a CLI interface for it as well. So if you want to interact with it via the CLI, you can actually do all of this configuration via the CLI. In this case, I'm using the portal just so people can get a visual experience. So when people go to portal.azure.com, what do they type over there to get here? Azure Database Migration Service — you add a new resource, Azure Database Migration Service. That's in essence what I've done. It will allow you to create a new project. And when you create that new project, all you're gonna do is specify your source and destination databases for that new project. And then you will have configuration that looks like this. Inside here, what we're now gonna do is we're actually gonna create an activity. That activity is gonna be the pipeline that enables us to do that database migration live. So let's do an online data migration. I'm gonna specify my source database. As I mentioned, it's Postgres in a VM. So in this scenario… you have to provide it with a password. It'll do a bit of validation in the background to make sure that it can connect to that instance. The password's pre-filled. So there, we've connected to our source database. Now we're gonna connect to the target database. This is the database that we've now set up in Azure that's got a schema in it, but no data. So that'll go again, make sure it can connect, do some validation. And then it allows us to choose the database that we want. We're gonna just do that sample database today. It gives us some options to set up some configurations that are not important right now. This is around limiting the size of your LOBs and how many processes you want to run in parallel — how many tables you load at a time, especially the big ones.
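For the CLI route Gary mentions, the service and project setup looks roughly like the following. This is a sketch only: the resource names, region, SKU, subnet path, and platform identifier strings are assumptions, not verified values, so check `az dms --help` for the exact syntax before relying on it.

```shell
# Rough sketch of creating a Database Migration Service instance and a
# Postgres-to-Azure project from the Azure CLI. All names, the region,
# the SKU, and the platform strings are illustrative assumptions.
az dms create \
    --name MyDmsService \
    --resource-group MyResourceGroup \
    --location westus2 \
    --sku-name Premium_4vCores \
    --subnet "/subscriptions/<sub-id>/resourceGroups/MyResourceGroup/providers/Microsoft.Network/virtualNetworks/MyVNet/subnets/MySubnet"

az dms project create \
    --service-name MyDmsService \
    --resource-group MyResourceGroup \
    --name SampleDbMigrate \
    --location westus2 \
    --source-platform PostgreSQL \
    --target-platform AzureDbForPostgreSQL
```

The migration activity (the pipeline created in the portal demo) can then be defined as a project task against that service.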
Then you name it right here. In our scenario, we've only got that one table, so it doesn't matter. But if you've got a real database in production, you're gonna have a lot of tables in there. I'm gonna call this Sample DB migrate. And we're gonna run that migration. So now what's gonna happen is it's gonna go back to that database, do all the validation against those connections that we've got, identify the tables, and build up a migration plan, or pipeline, for this migration to run end to end. And here, you can see it's initializing the pipeline. It takes a couple of seconds. But while it does this, I wanna show you what we've done when we ran those scripts that we were talking about earlier — I just ran them, I never showed you what's actually inside them. So the schema migration basically does a pg_dump. That's a utility that we have with Postgres that allows you to dump the database. We specify a parameter that says it needs to be the schema only. Not the data set.
Not the data set. And then we do an import using just a straight psql command; we rerun the script that pg_dump generates, and that'll go and create all the objects inside there. So that's the one script that we ran. And then the other one that we've done is the create-load script. And this create-load script is just a loop that runs through and creates a record. We saw the results of it. But you can see, it's just super simple stuff. So let's go back to the VM. Over here we have — it is now running. What we'll see is that there was a full load that was completed. Because we've had that load running in the background, there's been a fair amount of data generated. Just in a little bit of time, we have 355 rows, and we have an incremental sync. So after it did that full load, we've got 46 rows applied. And if we go back to our database itself and we have a look here, what we'll see is we've got 446 rows. It should be pretty close to what we've got in our source database now, 455. So there's a little bit of lag as it tries to migrate that data; there will be bandwidth restrictions if you're migrating from on-premises, so it can take a little bit of time. So what we're gonna wanna do is stop that process that's generating the data. We're now ready to cut over. So we're gonna stop the process that's creating the data, and we are going to finalize the cutover. The moment has arrived.
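The schema-migration script Gary describes boils down to two commands: a schema-only pg_dump and a psql replay of the generated script. The host, user, and database names below are placeholders, not the demo's actual values.

```shell
# 1. Dump only the schema (object definitions, no rows) from the local
#    on-premises server:
pg_dump --schema-only --host=localhost --username=postgres \
        --dbname=sampledb --file=sampledb_schema.sql

# 2. Replay the generated script against the Azure target so it has the
#    same objects, empty of data (single-server logins take the
#    user@servername form):
psql --host=mydemoserver.postgres.database.azure.com \
     --username=adminuser@mydemoserver --dbname=sampledb \
     --file=sampledb_schema.sql
```

After step 2 the Azure database "looks the same, without the data", which is exactly the prerequisite state the Database Migration Service needs before its initial load.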
The moment has arrived. So when you start this cutover, it'll tell you that there are pending changes, and it'll slowly bring those pending changes down to zero. So it will continue to apply updates in the background to a point where we're ready to cut over; within a few seconds it should update. And once it's finished that migration, it's going to enable you to disconnect from the on-premises database. So what's happening in here doesn't matter. You see, it's gone down to zero; we're gonna click Apply. And now once this is applied, the only remaining step that you have is to reconfigure the connection strings and you're over there. Oh, wow. That's exciting. That's how simple it is to migrate databases across to Azure. And the application will now connect to the new database. Once you reconfigure the connection strings, the applications are on the new database. So much easier than what I used to do before. This is awesome. So I've shared some resources for customers wanting to do this. We've got some great tutorials up there; you can go through them step by step and achieve exactly the same thing in exactly the same way. Fantastic. Thank you so much, Gary, for your time here. Thank you, Sanjay, for having me. Great. Thanks for watching this short Azure readiness video.