lib/fog/azurerm/docs/storage.md in fog-azure-rm-0.1.0 vs lib/fog/azurerm/docs/storage.md in fog-azure-rm-0.1.1

- old
+ new

@@ -161,163 +161,217 @@
 ## Create a storage container
 
 Create a storage container in the current storage account.
 
 ```ruby
-container = azure_storage_service.create_container(
-  name: '<container name>'
+directory = azure_storage_service.directories.create(
+  key: '<container name>',
+  public: true
 )
-puts "#{container.name}"
+puts directory.key
 ```
 
 ## List storage containers
 
 List all the storage containers in the current storage accounts.
 
 ```ruby
 azure_storage_service.directories.each do |directory|
-  puts "#{directory.name}"
+  puts directory.key
 end
 ```
 
-## Get the access control list of the storage container
+## Get the access control level of the storage container
 
 Get the permissions for the specified container. The permissions indicate whether container data may be accessed publicly.
 
 ```ruby
-directory = azure_storage_service.directories.get('<container name>')
-access_control_list = directory.get_access_control_list('<container name>')
-puts "#{access_control_list.inspect}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+puts directory.acl
 ```
 
+## Set the access control level of the storage container
+
+Set the permissions for the specified container. The permissions indicate whether container data may be accessed publicly. The container permissions provide the following options for managing container access:
+
+ - container
+
+   Full public read access. Container and blob data can be read via anonymous request. Clients can enumerate blobs within the container via anonymous request, but cannot enumerate containers within the storage account.
+
+ - blob
+
+   Public read access for blobs only. Blob data within this container can be read via anonymous request, but container data is not available. Clients cannot enumerate blobs within the container via anonymous request.
+
+ - nil
+
+   No public read access. Container and blob data can be read by the account owner only.
+
+```ruby
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+directory.acl = 'container'
+directory.save(is_create: false)
+```
+
 ## Delete the storage container
 
 Mark the specified container for deletion. The container and any blobs contained within it are later deleted during garbage collection.
 
 ```ruby
-directory = azure_storage_service.directories.get('<container name>')
-result = directory.destroy
-puts "#{result}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+puts directory.destroy
 ```
 
-## Upload a local file as a blob
+## Upload data as a block blob
 
 ```ruby
-new_cloud_file = azure_storage_service.files.get('<Container name>', '<Blob name>').create(file_path: '<file path>')
-puts "#{new_cloud_file.inspect}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+options = {
+  key: '<Blob Name>',
+  body: '<Blob Content>'
+}
+new_block_blob = directory.files.create(options)
+puts new_block_blob.inspect
 ```
 
-## Copy Blob from one container to another
+## Upload a local file as a block blob
 
 ```ruby
-puts storage_data.copy_blob('<destination_container_name>', '<destination_blob_name>', '<source_container_name>', '<source_blob_name>')
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+File.open('<File Path>') do |file|
+  options = {
+    key: '<Blob Name>',
+    body: file
+  }
+  new_block_blob = directory.files.create(options)
+  puts new_block_blob.inspect
+end
 ```
 
-## Download a blob to a local file
+## Upload VHD data as a page blob
 
 ```ruby
-blob = azure_storage_service.files.get('<Container name>', '<Blob name>').save_to_file('<file path>')
-puts "#{blob.inspect}"
-puts "File Size: #{::File.size <file_path>}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+options = {
+  key: '<Blob Name>',
+  body: '<Blob Content>',
+  blob_type: 'PageBlob'
+}
+new_page_blob = directory.files.create(options)
+puts new_page_blob.inspect
 ```
 
-## Delete the storage blob
-
-Mark the specified blob for deletion. The blob is later deleted during garbage collection.
-
+## Upload a local VHD as a page blob
 ```ruby
-cloud_file = azure_storage_service.files.get('<container name>', '<blob name>')
-result = cloud_file.destroy
-puts "#{result}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+File.open('<File Path>') do |file|
+  options = {
+    key: '<Blob Name>',
+    body: file,
+    blob_type: 'PageBlob'
+  }
+  new_page_blob = directory.files.create(options)
+  puts new_page_blob.inspect
+end
 ```
 
-Note that in order to delete a blob, you must delete all of its snapshots.
-
+## Copy Blob from one container to another
 ```ruby
-cloud_file = azure_storage_service.files.get('<container name>', '<blob name>')
-result = cloud_file.destroy(delete_snapshots: 'only')
-puts "#{result}"
+directory = azure_storage_service.directories.get('<Source Container Name>', max_results: 1)
+copied_blob = directory.files.head('<Source Blob Name>').copy('<Destination Container Name>', '<Destination Blob Name>')
+puts copied_blob.inspect
+```
 
-result = cloud_file.destroy
-puts "#{result}"
+## Copy Blob from one uri to self
+```ruby
+directory = azure_storage_service.directories.get('<Destination Container Name>', max_results: 1)
+copied_blob = directory.files.new(key: '<Destination Blob Name>')
+copied_blob.copy_from_uri('<Source Blob Uri>')
+puts copied_blob.inspect
 ```
 
-You can delete both at the same time by specifying the option.
-
+## Download a small blob to a local file
 ```ruby
-cloud_file = azure_storage_service.files.get('<container name>', '<blob name>')
-result = cloud_file.destroy(delete_snapshots: 'inlcude')
-puts "#{result}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+blob = directory.files.get('<Blob Name>')
+File.open('<File Path>', 'wb') do |file|
+  file.write(blob.body)
+end
+puts "File Size: #{::File.size <File Path>}"
 ```
 
-## Properties
-
-### Get storage container properties
-
-Get the storage container properties. The properties will not fetch the access control list. Call `get_container_access_control_list` to fetch it.
-
+## Download a large blob to a local file
 ```ruby
-directory = azure_storage_service.directories.get('<container name>')
-properties = directory.get_properties
-puts "#{properties.inspect}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+File.open('<File Path>', 'wb') do |file|
+  directory.files.get('<Blob Name>') do |chunk, remaining_bytes, total_bytes|
+    puts "remaining_bytes: #{remaining_bytes}, total_bytes: #{total_bytes}"
+    file.write(chunk)
+  end
+end
+puts "File Size: #{::File.size <File Path>}"
 ```
 
-### Get storage blob properties
+## Delete the storage blob
 
-Get the storage blob properties.
+Mark the specified blob for deletion. The blob is later deleted during garbage collection.
 
 ```ruby
-cloud_file = azure_storage_service.files.get('<container name>', '<blob name>')
-properties = cloud_file.get_properties
-puts "#{properties.inspect}"
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+blob = directory.files.head('<Blob Name>')
+puts blob.destroy
 ```
 
 ### Set storage blob properties
 
-Set the storage blob properties. The properties are passed in name/value pairs.
+Set the storage blob properties.
 
 ```ruby
-cloud_file = azure_storage_service.files.get('<container name>', '<blob name>')
-properties = {
-  "content_language" => "English",
-  "content_disposition" => "attachment"
-}
-cloud_file.set_properties(properties)
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+blob = directory.files.head('<Blob Name>')
+blob.content_language = "English"
+blob.content_disposition = "attachment"
+blob.save(update_body: false)
 ```
 
 ## Metadata
 
 Metadata allows us to provide descriptive information about specific containers or blobs. This is simply providing name/value pairs of data we want to set on the container or blob.
 
 ### Get Blob Metadata
 
 ```ruby
-azure_storage_service.files.get('<Container name>', '<Blob name>').get_metadata
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+blob = directory.files.head('<Blob Name>')
+puts blob.metadata
 ```
 
 ### Set Blob Metadata
 
 ```ruby
-metadata = {
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+blob = directory.files.head('<Blob Name>')
+blob.metadata = {
   "Category" => "Images",
   "Resolution" => "High"
 }
-azure_storage_service.files.get('<Container name>', '<Blob name>').set_metadata(metadata)
+blob.save(update_body: false)
 ```
 
 ### Get Container Metadata
 
 ```ruby
-azure_storage_service.directories.get_metadata('<Container name>')
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+puts directory.metadata
 ```
 
 ### Set Container Metadata
 
 ```ruby
-metadata = {
+directory = azure_storage_service.directories.get('<Container Name>', max_results: 1)
+directory.metadata = {
   "CreatedBy" => "User",
   "SourceMachine" => "Mymachine",
   "category" => "guidance",
   "docType" => "textDocuments"
 }
-azure_storage_service.directories.set_metadata('<Container name>', metadata)
+directory.save(is_create: false)
 ```
 
 ### Create Recovery Vault
 
 Create a new Recovery Vault object
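
A note for readers of the new examples: every snippet on the `+` side calls methods on an `azure_storage_service` object that is created earlier in storage.md, outside this hunk. The sketch below shows one way that connection is commonly set up; it is an assumption for illustration (the generic Fog connection pattern with the fog-azure-rm storage-key credentials), not part of this diff, and the exact parameter set can differ between gem versions.

```ruby
require 'fog/azurerm'

# Assumed setup, not shown in this hunk: a storage data connection for the
# directories/files calls used in the examples above. The credential keys follow
# the fog-azure-rm convention; required parameters may vary between releases.
azure_storage_service = Fog::Storage.new(
  provider: 'AzureRM',
  azure_storage_account_name: '<Storage Account Name>',
  azure_storage_access_key: '<Storage Account Access Key>'
)

# With the connection in place, the new (0.1.1) container API from the diff
# can be exercised directly:
directory = azure_storage_service.directories.create(
  key: '<container name>',
  public: true
)
puts directory.key
```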