Microsoft's Office 365 service gives businesses a great opportunity to leverage fluid scalability to meet their needs. Moving off an on-premises Exchange server can be daunting, but the Office 365 Import Service, which imports PST files into cloud mailboxes, can make the transition much easier.
This step-by-step details how to use the Office 365 Import Service to upload an unencrypted PST file to Office 365.
The account completing the import must be assigned the Mailbox Import Export role. This can be done by adding the role to the Organization Management role group:
- Open Exchange Admin Center
- Click Permissions
- Click Admin Roles
- Double Click Organization Management
- Click + Roles
- Select Mailbox Import Export
- Click Add
- Click OK
- Click Save
Alternatively, you can create a new role group and assign your account permissions.
The PST files being imported need to be stored on a network file share or a file share on your local PC. Note the path syntax used with AzCopy.exe in the later steps.
Storage Key and Upload URL –
NOTE: During this process you are provided a storage key and an upload URL. Secure these and treat them as you would a password; anyone who obtains them can upload data to your tenant.
- Navigate to https://protection.office.com
- Sign in with a Global Admin account for your organization
- Click Data Management
- Click Import
- Click Go to the Import Service
- Click
- Click Upload Files Over the Network
- On the popup page, click Download Tool (Azure AzCopy tool)
- Click Run
- Click Next
- Agree to the EULA and click Next
- Accept the default install location
- Click Next
- Click Install
- Click Yes
- Click Finish
Next we need to get the secure upload key and the URL before the AzCopy tool can be used to upload the PST file(s) to Office 365.
- Open the Import Data to Office 365 page that opened in the previous step
- Click the icon
NOTE: This is a secure key and URL. Make sure they are kept secure.
- Click Copy Key (this process can take up to 5 minutes to complete)
- Click Show URL for PST Files
- Copy the key and URL for use in the next step
Once the AzCopy tool has been downloaded and installed and the secure key and URL have been acquired, the PST files can be uploaded.
- Open a command prompt as an admin (on the machine where you installed AzCopy)
- Open the directory where you installed AzCopy
- Run the following command to start uploading the PST Files
AzCopy.exe /Source:\\SERVER01\PSTshare /Dest:<Insert URL Here>/SERVER01/PSTshare/ /Destkey:<Insert Secure Key here> /S /V:C:\PSTshare\Uploadlog.log
NOTE:
\\SERVER01\PSTshare – This denotes the share where your PST file(s) are placed. If there are multiple PST files in this location, AzCopy will upload them all.
C:\PSTshare\Uploadlog.log – This denotes a location on the local machine where the verbose log file will be written.
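The upload step above can also be scripted. The sketch below simply assembles the AzCopy command line shown above; the share, destination URL, key, and log path are placeholders that you must replace with your own values, and actually running the command assumes AzCopy.exe is installed and on the PATH.

```python
import subprocess

def build_azcopy_command(source_share, dest_url, dest_key, log_path):
    """Assemble the AzCopy upload command from the step above.

    All four arguments are placeholders; substitute your own file
    share, the upload URL and secure key from the Import Service,
    and a local path for the verbose log.
    """
    return [
        "AzCopy.exe",
        f"/Source:{source_share}",
        f"/Dest:{dest_url}",
        f"/Destkey:{dest_key}",
        "/S",                 # recurse into subdirectories
        f"/V:{log_path}",     # write a verbose log here
    ]

if __name__ == "__main__":
    cmd = build_azcopy_command(
        r"\\SERVER01\PSTshare",                       # hypothetical share
        "<Insert URL Here>/SERVER01/PSTshare/",       # from the Import Service
        "<Insert Secure Key here>",                   # from the Import Service
        r"C:\PSTshare\Uploadlog.log",
    )
    print(" ".join(cmd))
    # Uncomment to actually start the upload:
    # subprocess.run(cmd, check=True)
```

Keeping the command in a small script makes it easy to rerun the upload with the same switches if a transfer is interrupted.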
Once the PST files are uploaded to Office 365, a CSV mapping file needs to be created to map each PST file to a mailbox in Office 365.
- Download the PST Mapping Template File from Microsoft
- Complete the CSV file with your specific information, filling in as many lines as needed (one line per uploaded PST file)
NOTE: Further explanation of the PST mapping file can be found here.
- Save the PST mapping file
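If you have many PST files, the mapping file can be generated with a short script instead of editing the template by hand. The sketch below is an assumption-laden example: the column names should be verified against the template you downloaded from Microsoft, and the file path, PST names, and mailbox addresses are hypothetical.

```python
import csv

# Header row as it appears in Microsoft's PST Import mapping template
# (verify against the downloaded template before use).
FIELDS = ["Workload", "FilePath", "Name", "Mailbox", "IsArchive",
          "TargetRootFolder", "ContentCodePage", "SPFileContainer",
          "SPManifestContainer", "SPSiteUrl"]

# Hypothetical rows: one line per uploaded PST file.
rows = [
    {"Workload": "Exchange", "FilePath": "SERVER01/PSTshare",
     "Name": "user1.pst", "Mailbox": "user1@contoso.com",
     "IsArchive": "FALSE", "TargetRootFolder": "/"},
    {"Workload": "Exchange", "FilePath": "SERVER01/PSTshare",
     "Name": "user2.pst", "Mailbox": "user2@contoso.com",
     "IsArchive": "FALSE", "TargetRootFolder": "/"},
]

# The Import Service validates mapping files of under 100 rows.
assert len(rows) < 100, "keep the mapping file under 100 rows"

with open("PstImportMappingFile.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS, restval="")
    writer.writeheader()
    writer.writerows(rows)
```

Unused columns (the SharePoint ones, for an Exchange import) are left blank via `restval`, matching how the template is normally filled in.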
Navigate to https://protection.office.com and sign in with a Global Admin account for your organization.
- Click Data Management
- Click Import
- Click Go to the Import Service
- Click
- Click Upload Files Over the Network
- Check I’m done uploading my files
- Check I have access to the mapping file
- Click Next
- Enter a Job Name
- Click Next
- Click + to Add the Mapping File
- Validate the Mapping File (Under 100 rows)
- Agree to the terms and conditions
- Click Finish
- Click Close
The import will now start. The status of the import can be checked by navigating to the Office 365 Admin Center and opening the Import tab. Use the refresh button to get an updated status.
Monitor the status column for completion or errors. Clicking the job and then selecting View Details lets you troubleshoot the status message. In the example below, one corrupted mail item was discovered via the detailed log provided with the upload.
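When a job reports errors, the AzCopy verbose log written during the upload is the first place to look. The sketch below is a minimal helper that pulls out log lines mentioning a failure; it assumes failed transfers are flagged with the word "failed" in the log text, which you should verify against your actual log, and the log path is the hypothetical one used earlier.

```python
def failed_transfers(log_path):
    """Return log lines that appear to describe a failed transfer.

    Assumes failures are flagged with the word 'failed' somewhere in
    the AzCopy verbose log line; adjust the filter to match the log
    format you actually see.
    """
    with open(log_path, encoding="utf-8", errors="replace") as f:
        return [line.rstrip() for line in f if "failed" in line.lower()]

if __name__ == "__main__":
    for line in failed_transfers(r"C:\PSTshare\Uploadlog.log"):
        print(line)
```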
12/4/2018
Welcome to version 1.6.0 of Microsoft Azure Storage Explorer. This release focuses on supporting RBAC for Blobs and ADLS Gen2 Storage accounts.
New
- You can now use Storage Explorer to access your Blob data via RBAC. If you are signed in and Storage Explorer is unable to retrieve the keys for your Storage account, then an OAuth token will be used to authenticate when interacting with your data.
- Storage Explorer now supports ADLS Gen2 Storage accounts. When Storage Explorer detects that hierarchical namespace is enabled for a Storage account, you will see '(ADLS Gen2 Preview)' next to the name of your Storage account. Storage Explorer is able to detect whether or not hierarchical namespace is enabled when you are signed in, or if you have attached your Storage Account with name and key. For ADLS Gen2 Storage accounts, you can use Storage Explorer to:
- Create and delete containers.
- Manage container properties and permissions (left-hand side).
- View and navigate data inside of containers.
- Create new folders.
- Upload, download, rename, and delete files and folders.
- Manage file and folder properties and permissions (right-hand side).
Other typical Blob features, such as Soft Delete and Snapshots, are not currently available. Managing permissions is also only available when signed in. Additionally, when working in an ADLS Gen2 Storage account, Storage Explorer will use AzCopy for all uploads and downloads and default to using name and key credentials for all operations if available.
- After strong user feedback, break lease can once again be used to break leases on multiple blobs at once.
Known Issues
- When downloading from an ADLS Gen2 Storage account, if one of the files being transferred already exists, then AzCopy will sometimes crash. This will be fixed in an upcoming hotfix.
- Detaching from a resource attached via SAS URI, such as a blob container, may cause an error that prevents other attachments from showing up correctly. To work around this issue, just refresh the group node. See #537 for more information.
- If you use VS for Mac and have ever created a custom AAD configuration, you may be unable to sign in. To work around the issue, delete the contents of ~/.IdentityService/AadConfigurations. If doing so does not unblock you, please comment on this issue.
- Azurite has not yet fully implemented all Storage APIs. Because of this, there may be unexpected errors or behavior when using Azurite for development storage.
- In rare cases, the tree focus may get stuck on Quick Access. To unstick the focus, you can Refresh All.
- Uploading from your OneDrive folder does not work because of a bug in NodeJS. The bug has been fixed, but not yet integrated into Electron. To work around this issue when uploading to or downloading from a blob container, you can use the experimental AzCopy feature.
- When targeting Azure Stack, uploading certain files as append blobs may fail.
- After clicking 'Cancel' on a task, it may take a while for that task to cancel. This is because we are using the cancel filter workaround described here.
- If you choose the wrong PIN/Smartcard certificate, then you will need to restart in order to have Storage Explorer forget that decision.
- Renaming blobs (individually or inside a renamed blob container) does not preserve snapshots. All other properties and metadata for blobs, files and entities are preserved during a rename.
- Azure Stack does not support the following features:
- File shares
- Access tiers
- Soft Delete
Attempting to use these features while working with Azure Stack resources may result in unexpected errors.
- The Electron shell used by Storage Explorer has trouble with some GPU (graphics processing unit) hardware acceleration. If Storage Explorer is displaying a blank (empty) main window, you can try launching Storage Explorer from the command line and disabling GPU acceleration by adding the --disable-gpu switch:
./StorageExplorer.exe --disable-gpu
- For Linux users, you will need to install .NET Core 2.0.
- For users on Ubuntu 14.04, you will need to ensure GCC is up to date - this can be done by running the following commands, and then restarting your machine:
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
- For users on Ubuntu 17.10, you will need to install GConf - this can be done by running the following commands, and then restarting your machine:
sudo apt-get install libgconf-2-4