How to upload all files from a folder to Microsoft Azure Storage

If you are working with the Microsoft cloud, you have probably needed to move files in bulk from a local folder to Azure Storage.
To get started, you will need a Microsoft Azure account; then create a storage account and get its connection string.
If you are unsure about how to proceed, check out this great introductory article: How to use Blob Storage from .NET
This is a demo connection string entry that you will need to add to the appSettings section of your web.config file:
<add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=srmd1983;AccountKey=JyUOu3/iv+0UMjzI/PtoHd2JKhKx4SOSSxJcsvVp95isAZH6hKpPs/AQDOPxgVXjTNGWCYCSssgiwVVun0rlWXFgJ6A==" />

In your code, import the required Microsoft.WindowsAzure.Storage namespaces. To get this DLL, install the Azure SDK for your version of Visual Studio and add the DLL to your project references.

Imports Microsoft.WindowsAzure.Storage
Imports Microsoft.WindowsAzure.Storage.Auth
Imports Microsoft.WindowsAzure.Storage.Blob

Then you will need to create a method that does all the dirty work. The files are uploaded into a separate virtual directory for each client (in this case, the directory name is taken from the ClientID value).

Imports System.Configuration
Imports System.IO
Imports Microsoft.WindowsAzure.Storage
Imports Microsoft.WindowsAzure.Storage.Auth
Imports Microsoft.WindowsAzure.Storage.Blob

Public Class Azure

    Shared Function UploadAllFilesToBlob(ClientID As String) As String
        Dim err As String = ""
        Try
            ' Parse the connection string stored in web.config.
            Dim storageAccount As CloudStorageAccount = CloudStorageAccount.Parse( _
                ConfigurationManager.AppSettings("StorageConnectionString"))
            Dim blobClient As CloudBlobClient = storageAccount.CreateCloudBlobClient()

            ' Retrieve a reference to the container and create it if it doesn't already exist.
            Dim container As CloudBlobContainer = blobClient.GetContainerReference("uploads")
            container.CreateIfNotExists()

            ' All blobs for this client go under a virtual directory named after the ClientID.
            Dim dir As CloudBlobDirectory = container.GetDirectoryReference(ClientID)

            ' The local folder to upload is configured under the "SavePath" app setting.
            Dim path As String = ConfigurationManager.AppSettings("SavePath")
            Dim di As New DirectoryInfo(path)

            For Each fi As FileInfo In di.GetFiles()
                ' Reference a blob named after the local file; if it already exists, it will be overwritten.
                Dim blockBlob As CloudBlockBlob = dir.GetBlockBlobReference(fi.Name)

                ' Upload the file, then remove the local copy.
                Using fileStream As FileStream = File.OpenRead(fi.FullName)
                    blockBlob.UploadFromStream(fileStream)
                End Using
                fi.Delete()
            Next
        Catch ex As Exception
            err = "Error: " + ex.Message + "<br />" + ex.StackTrace
        End Try
        Return err
    End Function

End Class
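
Once the class is in place, uploading everything that has been written to the SavePath folder is a one-line call. Here is a minimal sketch; the ClientID value and the ASP.NET calling context are placeholders for illustration, not part of the original code:

' Hypothetical call site, e.g. in an ASP.NET page or handler after files
' have been saved to the folder configured under "SavePath".
Dim result As String = Azure.UploadAllFilesToBlob("CLIENT-1234") ' placeholder ClientID
If result <> "" Then
    ' The function returns an empty string on success, or the error details on failure.
    Response.Write(result)
End If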

Windows Azure Security Whitepaper Released

After a lot of waiting, the Windows Azure Security Whitepaper has been released.

To download the latest release, check out the link below:

Windows Azure Network Security Whitepaper – FINAL

If you would like to keep your server or application secure, make sure you follow these guidelines, as well as the Security Best Practices for Windows Azure Apps.

 

Windows Azure Security Layers

SQL Azure Index Defragmentation and Performance Tuning


Reorganising and rebuilding indexes is crucial to maintaining database performance; however, on SQL Azure indexes are much less transparent, and the standard “DBCC DBREINDEX” approach isn’t supported on the platform.

However, Chris Pietschmann has posted an article on rebuilding all the indexes in a database here. The statement to rebuild all the indexes on a single table is:

ALTER INDEX ALL ON [dbo].[MyTable] REBUILD

To do this for all indexes within the database, you can use the script below (this is a modified version of Chris’s script which supports tables in different schemas).

DECLARE @TableName varchar(255)

-- Cursor over every user table, schema-qualified so tables outside dbo are included.
DECLARE TableCursor CURSOR FOR
SELECT '[' + IST.TABLE_SCHEMA + '].[' + IST.TABLE_NAME + ']' AS [TableName]
FROM INFORMATION_SCHEMA.TABLES IST
WHERE IST.TABLE_TYPE = 'BASE TABLE'

OPEN TableCursor
FETCH NEXT FROM TableCursor INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT('Rebuilding Indexes on ' + @TableName)
    EXEC('ALTER INDEX ALL ON ' + @TableName + ' REBUILD')
    FETCH NEXT FROM TableCursor INTO @TableName
END

CLOSE TableCursor
DEALLOCATE TableCursor
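
If you want to decide whether a rebuild is actually worth running, you can look at index fragmentation first. The sketch below is not part of Chris’s script; it assumes the sys.dm_db_index_physical_stats DMV is available on your SQL Azure server (it is supported on current Azure SQL Database), and the 10% threshold is just an example value:

-- Rough sketch: list indexes in the current database with more than 10% fragmentation.
-- Assumes sys.dm_db_index_physical_stats is available on your SQL Azure server.
SELECT OBJECT_SCHEMA_NAME(ips.object_id) + '.' + OBJECT_NAME(ips.object_id) AS [TableName],
       i.name AS [IndexName],
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
INNER JOIN sys.indexes AS i
    ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE i.name IS NOT NULL
  AND ips.avg_fragmentation_in_percent > 10
ORDER BY ips.avg_fragmentation_in_percent DESC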

Experts Warned of Cloud Complexity

A Yale researcher has warned that cloud-based systems might melt down as they become more and more complex.

Bryan Ford has written a paper, which he will soon present at the USENIX HotCloud 2012 conference. The paper argues that as cloud computing becomes more mainstream, major operational “meltdowns” may arise: the underlying systems are growing so complex that the complexity itself becomes a cause of failures.

Ford explained that as diverse cloud services share ever more fluidly and aggressively multiplexed hardware resource pools, the chance increases that unexpected things will happen, including unpredictable interactions between load-balancing and other reactive mechanisms. This may result in dynamic instabilities, also known as “meltdowns”.

According to the paper, this is a little like the intertwined, complex relationships and structures that helped bring about the global financial crisis. Ford points out that new cloud services may emerge that resell, trade, or speculate on complex “derivatives”, much like the financial trading industry.

Such components will be maintained and deployed by different companies which, due to competition, will share as little as possible about the internal operation of their services. As a result, the cloud industry might face speculative bubbles. The paper predicts occasional large-scale failures caused by composite cloud services whose weaknesses are not revealed until those bubbles burst.

Meanwhile, there is no ready solution to the problem. The only advice the paper offers is that providers should release detailed data about their system dependencies to an independent third party offering cloud reliability analysis services.