This is the second article in a two-part series. In the former article, we reviewed, at a high level, the components and concepts involved in the process of importing PST files to Exchange Online mailboxes.
The current article serves as a step-by-step guide, with a detailed description of each step required to import PST files to Exchange Online mailboxes.
Phase A – Upload the PST files to the Azure store
In the following section, we review the step-by-step process for a scenario in which we want to import two PST files into the Exchange Online mailbox of a user named Bob.
1. Prepare the PST files and the shared folder
Technically speaking, we can store the organization users' PST files on any host we like, as long as the AzCopy utility is able to access this host.
If we store the PST files on a specific network host, we need to share the folder which contains the PST files.
Regarding the name of the folder that stores the PST files, and regarding the PST file names themselves, I recommend using a simple naming convention. The purpose is to prevent, in advance, any import issues related to a problematic PST file name.
The AzCopy utility will need to “know” the path of the PST files. The path is written using UNC (Universal Naming Convention) syntax, which is based on the following pattern:
\\HOST-NAME\Shared folder name.
In our example, I created a folder named PSTimport on drive C: and copied all the PST files to this folder. In addition, we need to share this folder. The host that stores the PST files is SRV05.
The UNC path that we will use for the AzCopy utility will be written in the following way: \\SRV05\PSTimport
Sharing the folder that contains the PST files
Although we should already be familiar with the operation of sharing a folder, just to be on the safe side, here is a quick guide (a PowerShell alternative is shown after the list):
- Right-click on the folder that you want to share (PSTimport in our example)
- Select the Sharing tab
- Select the – Advanced Sharing…
- Select the checkbox – Share this folder
- Select the Everyone group
- Select – Read
- Select – OK
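If you prefer to create the share from PowerShell rather than from the GUI, the following is a minimal sketch, assuming the folder already exists at C:\PSTimport and that the command runs in an elevated PowerShell session on SRV05 (it relies on the built-in SmbShare module, available on Windows Server 2012 / Windows 8 or later):
# Share the local PSTimport folder with read access for the Everyone group
New-SmbShare -Name "PSTimport" -Path "C:\PSTimport" -ReadAccess "Everyone"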
In the following screenshot, we can see the PSTimport folder that includes the PST files that will be uploaded to the Azure store later.
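Once the folder is shared, it can be worth verifying, from the machine that will run AzCopy, that the UNC path is reachable. A minimal PowerShell sketch, using the host and share names from our example:
# Quick sanity check: can this machine see the shared folder?
Test-Path "\\SRV05\PSTimport"
# List the PST files that are about to be uploaded
Get-ChildItem "\\SRV05\PSTimport" -Filter *.pst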
2. Get the URL address of the Azure store (SAS)
Before we can start to run the PST file import job, we need to get the URL address of the Azure store which will “host” the PST files that we will upload to the cloud.
To get the required URL address value, we will start a NEW PST file import job, only for the purpose of getting the SAS URL address (we will not complete this specific PST import job).
After we get the required SAS URL value, we will use this address later as part of the AzCopy command syntax that uploads the PST files (the /Dest parameter).
1. Log in to the Office 365 Security & Compliance portal
We can access the Office 365 Security & Compliance portal by using the URL address (https://protection.office.com/#/homepage) or from the Office 365 admin portal.
- On the main page of the Office 365 admin portal, select the menu Admin centers
- Select the sub menu – Security & Compliance
- On the left menu bar, select the menu – Data governance
In the next step, we will start the import PST wizard, just for getting the value of the SAS URL address.
- Under the Data governance menu, select the submenu – Import
- Click on the + New import job button
- Type a name for the NEW import PST job.
- Click Next
Quick reminder: keep it simple – choose a name without capital letters, spaces, special characters and so on.
The default option is – Upload your data
- Click Next
Under section “2” (Copy the SAS URL for network upload. You’ll use this in the Dest parameter), click on the link named – Show network upload SAS URL
After a couple of seconds, the SAS URL address is generated.
- Copy the SAS URL address by clicking on the button – Copy to clipboard
My advice is to save the value of the SAS URL in a file so that, in the future, we will be able to easily retrieve this value.
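For example, right after clicking the Copy to clipboard button, the value can be dumped into a text file from PowerShell. A minimal sketch (the file path is just an illustrative choice; Get-Clipboard requires Windows PowerShell 5.0 or later):
# Save the SAS URL that is currently on the clipboard into a text file for later use
Get-Clipboard | Set-Content -Path "C:\PSTimport\SAS-URL.txt"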
3. Using the AzCopy tool to upload the PST files
In this step, we need to complete two tasks:
- Task 1 – download + install the “AzCopy” utility.
- Task 2 – prepare the AzCopy command syntax using the required parameters and run the command via the “AzCopy” utility.
Task 1 – download + install the “AzCopy” utility.
Download and install the AzCopy tool. The installation wizard is quite simple, and after the installation is completed, the AzCopy files will be located in the following path:
C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy
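To confirm that the utility ended up in the expected location, a quick check from PowerShell can be used (a sketch, assuming the default installation path shown above):
# Verify that AzCopy.exe exists in the default installation folder
Test-Path "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"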
Task 2 – prepare the AzCopy command syntax using the required parameters and run the command via the “AzCopy” utility.
The AzCopy command syntax is quite simple but can be a little confusing when using the different parameters for the first time.
The two mandatory parameters that we need to define for the AzCopy command line are “Source” and “Dest” (destination).
The “/Source:” parameter defines the location (path) of the PST files. The AzCopy command needs to know the exact location of the PST files that will be uploaded to the “cloud” (to the Azure store).
The path to the PST files is written using the UNC naming convention, which combines the hostname that holds the shared folder and the name of the shared folder that contains the PST files.
As mentioned earlier, we will need to share the folder containing the PST files in advance.
In our example, the hostname is – SRV05 and the shared folder name is PSTimport.
The path that we define for the AzCopy command will be written in the following way:
/Source:\\srv05\PSTImport
The “/Dest:” parameter defines the destination location that will store the PST files. In other words, the Azure store address to which the PST files will be copied.
The information about the Azure store address is provided via a URL address that is defined as a SAS (Shared Access Signature).
The basic assumption is that we already have the required URL address (we reviewed how to get this URL address in the former section).
An example of the syntax that we use is:
1 | /Dest:"https://376defafa7ca5ab3aee104e.blob.core.windows.net/ingestiondata?sv=2012-02-12&se=2018-01-25T02%3A28%3A40Z&sr=c&si=IngestionSasForAzCopy201712251737533592&sig=CIwhEtyY7qLQiFP3%2B7wRuSXwCvtz14bQHzAn2t8F6lw%3D" |
Notice that we place a double quote mark before and after the “original SAS URL address.”
The “/V:” parameter is used to define the path of a LOG file that will record the success or failure events of the upload process. This is not a mandatory parameter.
In our example, we want to create the LOG file in the C:\temp folder, and the LOG file name is importPST.log.
An example of the syntax that we use is:
/V:"C:\temp\importPST.log"
In the following diagram, we can see an example of the “full syntax” that we provide to the AzCopy command:
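For reference, this is how the three parameters look when combined into a single command line (the storage account name and SAS token below are placeholders; replace them with your own values):
AzCopy.exe /Source:\\SRV05\PSTimport /Dest:"https://<storage-account>.blob.core.windows.net/ingestiondata?<SAS-token>" /V:"C:\temp\importPST.log"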
Option 1 – Copy and paste the AzCopy command syntax into a command line window
When using the standard Windows command line to run the AzCopy command, my recommendation is to prepare the required command syntax in advance and save it in a text file. We will copy the command syntax and, in the second step, paste the command syntax into the command window.
Step 1 of 2
To be able to run the AzCopy command line with the required syntax, we can use the command window (Start > Run > cmd).
We will need to navigate to the path of the AzCopy.exe file.
For example:
1 | cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy" |
Step 2 of 2
In the command window, paste the command syntax that was prepared in the former step.
An example of the syntax that we use is:
AzCopy.exe /Source:\\SRV05\PSTImport /Dest:"https://376defafa7ca4ab3aee104e.blob.core.windows.net/ingestiondata?sv=2012-02-12&se=2018-01-25T02%3A28%3A40Z&sr=c&si=IngestionSasForAzCopy201712220737533592&sig=CIwhEtyY7qLQiFP3%2B7wRuSXwCvtz13bQHzAn5t8F6lw%3D" /V:"C:\temp\importPST.log"
In the following screenshot, we can see that AzCopy managed to run the command, locate the PST files, and successfully upload them to the Azure store.
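If you want to double-check the result, the log file that we defined with the /V parameter can also be reviewed. A minimal PowerShell sketch, using the log path from our example:
# Show the last lines of the AzCopy log file to review the upload result
Get-Content "C:\temp\importPST.log" -Tail 20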
Option 2 – Run the AzCopy command syntax using PowerShell console and variables
Although we can run the AzCopy command line from a command window, the “long” syntax of the Azure store URL address (described as a SAS) can cause some problems when using the command window (spaces, special characters, etc.).
For this reason, my preferred option is to run the AzCopy command from the windows “graphic” PowerShell console – the windows PowerShell ISE.
An additional change in the way we run the command is that, in this case, we use variables that store the values that we want to use, such as the path of the PST files and the URL address of the Azure store.
In the following screenshot, we can see the way that we use the Windows PowerShell ISE with the required variables.
An example of the PowerShell syntax that we use is:
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
$PSTFile = "\\SRV05\PSTImport"
$AzureStore = "https://376defafa7ca4ab3aee104e.blob.core.windows.net/ingestiondata?sv=2012-02-12&se=2015-01-25T02%3A28%3A40Z&sr=c&si=IngestionSasForAzCopy201412250737533592&sig=CIwhEtyY7qLQiFP3%2B7wRuSXwCvtz14bQHzAn5t8F6lw%3D"
$LogFile = "C:\temp\importPST.txt"
& .\AzCopy.exe /Source:$PSTFile /Dest:$AzureStore /V:$LogFile
Phase B – Import the PST files to the respective Exchange Online mailboxes
In this phase, we assume that you have already finished uploading all the required PST files to the Azure store.
Now, we are going to create a new PST import job, using the Office 365 Security & Compliance admin portal.
The New import job includes the following steps:
- Upload the CSV file – the import PST job will use the CSV file that we provide as a “map file.” The CSV file includes information about the PST files that we uploaded to the Azure store + the names of the Exchange Online mailboxes. The PST import job will “pull” each of the PST files mentioned in the CSV file and import each PST file into a specific Exchange Online mailbox.
- Verify the content of the CSV file – the import PST job will scan the content of the CSV file, and verify that the information structure of the CSV file is “legal” and created using the “right syntax.”
- Import the PST files from the Azure store to the appropriate Exchange Online mailbox.
1. Preparing the CSV file
In the following screenshot, we can see an example of a CSV file that we use for “instructing” the Office 365 import PST job about the specific PST files that we want to import into a specific Exchange Online mailbox.
Generally speaking, the CSV file can include a number of “column headers” that we can use for various scenarios.
In our scenario, I would like to emphasize only the most important columns.
1. Workload (mandatory value)
The “Exchange” value is the default value for a scenario in which we import PST files into an Exchange Online mailbox. In other words, add this value to each row and don’t “mess” with this value.
2. Name (mandatory value)
This is the name of the PST file that the Office 365 PST import job expects to find in our Azure store (the PST files that were uploaded in the former steps).
3. Mailbox (mandatory value)
This is the name of the Exchange Online mailbox into which the PST file will be imported.
4. TargetRootFolder
This is the name of the folder that will be created in the “destination Exchange Online mailbox,” which will contain the imported PST file (mail items).
In case we don’t provide any name, the Office 365 PST import batch will automatically create a folder named “Imported” in the destination Exchange Online mailbox, which will store the content of the PST file.
In our scenario, the Office 365 PST import batch will execute the following sequence of tasks:
- The first PST file listed in the CSV file will be imported into the Exchange Online mailbox of our recipient, Bob. Because we didn’t provide any folder name, the PST file will be imported into the default folder that will be created – the “Imported” folder.
- The second PST file listed in the CSV file will be imported into the Exchange Online mailbox of our recipient, Bob. Because we provided a folder name (PST_import-Test), the Office 365 PST import batch will create this folder, which will store the content of the imported PST file.
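A minimal sketch of such a mapping CSV file, limited to the columns discussed above (the PST file names bob-archive-1.pst and bob-archive-2.pst and the mailbox address are hypothetical placeholders; the official Microsoft template contains additional optional columns, such as FilePath and IsArchive, which can be left empty in this scenario):
Workload,FilePath,Name,Mailbox,TargetRootFolder
Exchange,,bob-archive-1.pst,bob@yourdomain.com,
Exchange,,bob-archive-2.pst,bob@yourdomain.com,PST_import-Test
The first row has an empty TargetRootFolder value, so its content will land in the default “Imported” folder; the second row specifies PST_import-Test as the target folder.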
2. Create a NEW import job
In this step, we use the Office 365 Security & Compliance admin interface to create a NEW import PST batch that will import the PST files located in the Azure store to Exchange Online mailboxes, based on the “instructions” in the CSV file.
- In the Office 365 Security & Compliance admin interface, select the menu Data governance
- Select the sub menu – Import
- Click on the + New import job button
- In the Name* text box provide the batch name (quick reminder, keep the name simple, no special characters, etc.).
The default option is – Upload your data
- Click Next
Check the two option boxes
- I’m done uploading my files
- I have access to the mapping file
- Click on the button named “+ Select mapping file”
Locate the required CSV file that was created in the former step.
Just a quick reminder, the CSV file will need to be prepared by you and contain the list of PST files that were already uploaded to the Azure store + the name of the Exchange Online mailboxes which will “host” a specific PST file from the list.
- Select the appropriate CSV file and click Open
3. Validate the CSV file
In this step, the Office 365 import PST batch will verify that the CSV file syntax and structure are valid.
- Select the option – Validate
In the following screenshot, we can see that the Office 365 import PST batch “informs” us that the CSV file is proper and valid.
- Click Save
The last wizard window informs us that the “import request” was successfully “registered.” Notice that at this point, the PST import process has not started yet!
- Click Close
- Click Close
4. Start the PST import process
In this step, we actually start the PST import process.
In the following screenshot, we can see that the status of our batch is “Analysis in progress.”
The Office 365 import PST batch will need to verify that it can locate the PST files in the Azure store, and it will perform additional background checks.
After the analysis process is completed, the status of the PST import batch appears as “Analysis completed”
- Select the PST import batch
- Click on the Import to Office 365 button
- Select the option – No, I want to import everything
Quick reminder: in our scenario, we don’t want to filter folders from the PST file(s).
- Click on the Import data button
- Click Close
In the following screenshot, we can see that the PST import batch status is “In progress”
After the PST import batch has completed, the status is “Success”
Verify that the PST file import to the Exchange Online mailbox was successfully completed
In this section, we want to verify that the “PST import batch” managed to successfully import the two PST files to Bob’s mailbox.
In the following screenshot, we can see Bob’s mailbox.
The mailbox includes two “NEW” folders:
1. Imported folder
As mentioned, the “PST import batch” is configured to automatically create this folder in the Exchange Online mailbox when no target folder name is provided. The Imported folder serves as the default container for the content of PST files imported to the specific Exchange Online mailbox.
2. PST_import-TEST
This folder was also created by the “PST import batch.” However, this time, the folder name was defined within the CSV file that we use as instructions for the “PST import batch.”
In the following screenshot, we can see the content of the Imported folder. We can see that the PST file content, including mail folders and mail items, was successfully imported.
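If you prefer to verify the result from PowerShell rather than from Outlook / OWA, the following is a minimal sketch, assuming that a connection to Exchange Online PowerShell is already established and that Bob’s mailbox can be resolved by the identity "Bob":
# List the folders created by the PST import, with their item counts and sizes
Get-MailboxFolderStatistics -Identity "Bob" |
    Where-Object { $_.Name -in "Imported", "PST_import-Test" } |
    Select-Object Name, ItemsInFolder, FolderSize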
We really want to know what you think about the article
The former article in the current article series


Hi there.
I have some problems with azcopy script running.
In my secure network upload SAS key I have “&” symbols, and when I run this script I have a problem.
My SAS key, with several values edited:
?sv=2012-02-12&se=9149-12-31T23%3A59%3B59Z&sr=c&si=IngestionSasForAzCopy201604092146162011&sig=wkf7AEkBXFeQTd%2F8i5PUjhZxDJtAN3FGqiRMzjVm3Wk%3D
Errors:
[2016/01/01 02:20:47][ERROR] The syntax of the command is incorrect. The supplied storage key (DestKey) is not a valid Base64 string.
‘se’ is not recognized as an internal or external command,
operable program or batch file.
‘sr’ is not recognized as an internal or external command,
operable program or batch file.
‘si’ is not recognized as an internal or external command,
operable program or batch file.
‘sig’ is not recognized as an internal or external command,
operable program or batch file.
and… when I changed the parameter Dest to DestSAS, I got a new error:
[ERROR] The syntax of the command is incorrect. Invalid SAS token in parameter “DestSAS”. Value of “se” is invalid in SAS token.
Have you found a solution for this problem? I have the same error.
Try putting the /DestSAS value in double quotes (“”)?
That helps, thanks 🙂
Hi There,
Is there any PowerShell command to perform task 9/9?
Hi,
First of all, thank you for this great article.
I’m working in a Hybrid environment and I need to move users to Office 365.
What happens with the cross permissions in this type of scenario?
Do I need to import all the users that share permissions between them?
For example: if I have a user A on premise and User B on premise, and A has FullAccess permissions on B mailbox.
Will I need to import both users in the same batch, as in a typical Hybrid migration?
Also, is there any size restriction for the PST file?
Thank you.