[IT168 News] Main EC2 features
Create a machine image containing your applications, libraries, data, and configuration, or use a preconfigured template image.
Upload the image to Amazon S3.
Configure security and network access through the web service.
Launch, terminate, and monitor any number of running instances of the image.
Pay only for the resources actually used.
EC2 usage workflow
Register an account with Amazon Web Services and enter a payment method.
Download and install the latest Java Runtime Environment, the Amazon EC2 command-line tools, and PuTTY & PuTTYgen.
Create and download your private key file and X.509 certificate.
Run 'C:\EC2\bin>ec2-describe-images -o self -o amazon' to list all public images.
Launch an instance: C:\EC2\bin>ec2-run-instances ami-25b6534c -k kiki-keypair
Check its status: C:\EC2\bin>ec2-describe-instances i-c3f31eaa
Grant the instance network access: C:\EC2\bin>ec2-authorize default -p 22
Connect to the instance with PuTTY.
Terminate the instance: C:\EC2\bin>ec2-terminate-instances i-c3f31eaa
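The same workflow can also be driven programmatically. The sketch below uses the right_aws Ruby gem (the same gem used in the S3 example later in this article) and reuses the AMI and keypair names from the command-line steps above; the method names follow the RightAws::Ec2 interface as documented for the gem, and the exact signatures and returned hash keys should be verified against the gem version you install.

require 'right_aws'

# Placeholder credentials -- substitute your own Access Key and Secret Key.
ec2 = RightAws::Ec2.new('YOUR_ACCESS_KEY_ID', 'YOUR_SECRET_ACCESS_KEY')

# List Amazon's public images, as ec2-describe-images -o amazon does.
images = ec2.describe_images_by_owner('amazon')
puts "#{images.size} public Amazon images"

# Launch one instance of the chosen AMI with an existing keypair.
launched = ec2.run_instances('ami-25b6534c', 1, 1, ['default'], 'kiki-keypair')
instance_id = launched.first[:aws_instance_id]

# Check its state (pending -> running), as ec2-describe-instances does.
puts ec2.describe_instances([instance_id]).first[:aws_state]

# Open TCP port 22 on the default security group so PuTTY/SSH can connect.
ec2.authorize_security_group_IP_ingress('default', 22, 22, 'tcp', '0.0.0.0/0')

# Shut the instance down when finished, as ec2-terminate-instances does.
ec2.terminate_instances([instance_id])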
EC2 instance types and pricing
Standard Instances
Instances of this family are well suited for most applications.
$0.10 - Small Instance (Default)
1.7 GB of memory, 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), 160 GB of instance storage, 32-bit platform
$0.40 - Large Instance
7.5 GB of memory, 4 EC2 Compute Units (2 virtual cores with 2 EC2 Compute Units each), 850 GB of instance storage, 64-bit platform
$0.80 - Extra Large Instance
15 GB of memory, 8 EC2 Compute Units (4 virtual cores with 2 EC2 Compute Units each), 1690 GB of instance storage, 64-bit platform
High-CPU Instances
Instances of this family have proportionally more CPU resources than memory (RAM) and are well suited for compute-intensive applications.
$0.20 - High-CPU Medium Instance
1.7 GB of memory, 5 EC2 Compute Units (2 virtual cores with 2.5 EC2 Compute Units each), 350 GB of instance storage, 32-bit platform
$0.80 - High-CPU Extra Large Instance
7 GB of memory, 20 EC2 Compute Units (8 virtual cores with 2.5 EC2 Compute Units each), 1690 GB of instance storage, 64-bit platform
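A quick sanity check on what these hourly rates mean for an always-on server: multiplying each rate by the hours in a 30-day month gives the figures below (illustrative arithmetic only; reserved or spot pricing is not considered).

hours_per_month = 24 * 30   # 720 hours in a 30-day month
{ 'Small'                => 0.10,
  'Large'                => 0.40,
  'Extra Large'          => 0.80,
  'High-CPU Medium'      => 0.20,
  'High-CPU Extra Large' => 0.80 }.each do |name, hourly_rate|
  puts format('%-22s $%.2f per month', name, hourly_rate * hours_per_month)
end
# Small: $72.00, Large: $288.00, Extra Large: $576.00,
# High-CPU Medium: $144.00, High-CPU Extra Large: $576.00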
EC2 data transfer and IP pricing
Internet Data Transfer
$0.100 per GB - all data transfer in
$0.170 per GB - first 10 TB / month data transfer out
$0.130 per GB - next 40 TB / month data transfer out
$0.110 per GB - next 100 TB / month data transfer out
$0.100 per GB - data transfer out / month over 150 TB
Elastic IP Addresses
No cost for Elastic IP addresses while in use
$0.01 per non-attached Elastic IP address per complete hour
$0.00 per Elastic IP address remap - first 100 remaps / month
$0.10 per Elastic IP address remap - additional remap / month over 100
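The outbound-transfer tiers and the Elastic IP remap fee are easiest to see with a worked example. The sketch below prices a hypothetical month with 60 TB of data transfer out and 150 address remaps; the traffic volume is an assumption chosen only to exercise the first three tiers.

# Tiered outbound transfer: each tier's size in TB and its per-GB rate.
def transfer_out_cost(tb)
  tiers = [[10, 0.170], [40, 0.130], [100, 0.110], [Float::INFINITY, 0.100]]
  cost = 0.0
  tiers.each do |tier_tb, rate_per_gb|
    used = [tb, tier_tb].min
    cost += used * 1024 * rate_per_gb   # convert TB to GB at this tier's rate
    tb -= used
    break if tb <= 0
  end
  cost
end

puts transfer_out_cost(60)   # 10 TB @ $0.17 + 40 TB @ $0.13 + 10 TB @ $0.11, per GB

# Elastic IP remaps: the first 100 per month are free, $0.10 each afterwards.
remaps = 150
puts [remaps - 100, 0].max * 0.10   # => 5.0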
S3 features
Stored objects can range from 1 byte up to 5 GB in size;
There is no limit on the number of objects stored;
Each object is stored in a bucket and identified by a unique, user-defined key;
Access permissions can be set on each individual object (see the sketch after this list);
REST and SOAP interfaces are provided for reading, writing, and deleting objects;
Availability is 99.9%.
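As a minimal illustration of per-object access control, the snippet below uploads one object with the canned 'public-read' ACL by passing the x-amz-acl header on the PUT. It uses the right_aws gem from the example later in this article; the bucket and key names are placeholders.

require 'right_aws'

s3 = RightAws::S3Interface.new('YOUR_ACCESS_KEY_ID', 'YOUR_SECRET_ACCESS_KEY')
# The x-amz-acl header applies a canned ACL to just this object,
# making it publicly readable while the rest of the bucket stays private.
s3.put('my-example-bucket', 'public/report.txt', 'hello world',
       'x-amz-acl' => 'public-read')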
S3 concepts and operations
S3 concepts
Account
After registering with Amazon and signing up for S3, you receive an Access Key and a Secret Key; both are required for every API call.
Bucket:
A bucket is the container that holds S3 objects and corresponds to an S3 domain name. Bucket names must be globally unique, and each bucket can store an unlimited number of objects.
Object:
An object is the basic storage unit in S3. A single object can be up to 5 GB in size, lives in a bucket, and is uniquely identified by a key. It consists of object data and metadata; the metadata can be standard or user-defined.
Key:
A key uniquely identifies an object within a bucket; in other words, bucket + key uniquely identifies an object across all of S3.
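Since bucket names are globally unique and each bucket maps to its own S3 domain, a bucket + key pair pins down exactly one object, which is why an object can be addressed with a plain URL. A small sketch (the bucket and key are placeholders):

bucket = 'my-unique-bucket'
key    = 'photos/2008/cat.jpg'
url    = "http://#{bucket}.s3.amazonaws.com/#{key}"
puts url   # one bucket + key -> one object -> one URL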
S3 operations
Service -- list all buckets;
Bucket -- create, delete, list, get/set access control, get/set logging status;
Object -- put, get, delete, set/get access control.
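The same three groups of operations can be expressed through the right_aws gem's RightAws::S3Interface. This is a sketch: the bucket and key names are placeholders, and the method names should be checked against the gem's documentation.

require 'right_aws'

s3 = RightAws::S3Interface.new('YOUR_ACCESS_KEY_ID', 'YOUR_SECRET_ACCESS_KEY')

s3.list_all_my_buckets                               # Service: list all buckets
s3.create_bucket('my-example-bucket')                # Bucket: create
s3.list_bucket('my-example-bucket')                  # Bucket: list its keys
s3.put('my-example-bucket', 'a-key', 'some data')    # Object: put
data = s3.get_object('my-example-bucket', 'a-key')   # Object: get
s3.delete('my-example-bucket', 'a-key')              # Object: delete
s3.delete_bucket('my-example-bucket')                # Bucket: delete (must be empty)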
S3 clients
S3 APIs and libraries
–REST and SOAP
–JAVA
–C#
–Python
–Ruby
–PHP,PERL...
S3 client tools
–S3cmd
–S3sync
–Firefox plug-in
–etc...
S3 usage example
require 'yaml'
require 'right_aws'

class Up2S3
  def initialize
    puts "start..."
    start = Time.now
    # Read the AWS credentials from amazon_conf.yml and open the S3 connection.
    conf_file = YAML.load_file(File.expand_path('amazon_conf.yml'))
    @@s3 = RightAws::S3Interface.new(conf_file['access_key_id'],
                                     conf_file['secret_access_key'])
    puts "@@s3=#{@@s3.to_s}"
    puts "Init Spend Time #{Time.now - start}"
  end

  # Download test code: stream the object in chunks and write it to a local
  # file named after the key.
  def self.download(bucket, key)
    start = Time.now
    foo = File.new(key, "wb+")
    @@s3.get(bucket, key) do |chunk|
      foo.write(chunk)
    end
    foo.close
    puts "Download Spend Time #{Time.now - start}"
  end

  # Upload test code: put the contents of a local file under the given key.
  def self.upload(bucket, key, filename)
    start = Time.now
    @@s3.put(bucket, key, File.open(filename))  #=> true
    puts "Upload Spend Time #{Time.now - start}"
  end
end
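The class above might be driven as follows; the bucket, key, and file name are placeholders, and amazon_conf.yml must contain the access_key_id and secret_access_key entries read in initialize.

Up2S3.new                                    # builds the shared @@s3 connection
Up2S3.upload('my-example-bucket', 'backup.tar.gz', '/tmp/backup.tar.gz')
Up2S3.download('my-example-bucket', 'backup.tar.gz')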
S3 pricing
US pricing:
Storage
$0.15 per GB-month of storage used
Data Transfer
$0.100 per GB - all data transfer in
$0.170 per GB - first 10 TB / month data transfer out
$0.130 per GB - next 40 TB / month data transfer out
$0.110 per GB - next 100 TB / month data transfer out
$0.100 per GB - data transfer out / month over 150 TB
Requests
$0.01 per 1,000 PUT, POST, or LIST requests
$0.01 per 10,000 GET and all other requests*
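To make the request charges concrete, here is illustrative arithmetic for a hypothetical month with 50 GB stored, 100 GB uploaded, 200 GB downloaded, 100,000 PUTs, and 1,000,000 GETs (all of the volumes are assumptions):

storage   = 50  * 0.15                      # GB-months of storage
xfer_in   = 100 * 0.100                     # all data transfer in
xfer_out  = 200 * 0.170                     # well inside the first 10 TB tier
put_cost  = 100_000   / 1_000.0  * 0.01     # PUT/POST/LIST requests
get_cost  = 1_000_000 / 10_000.0 * 0.01     # GET and all other requests
puts storage + xfer_in + xfer_out + put_cost + get_cost   # => 53.5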
Amazon AWS's influence
Top 10 Enterprises in the Cloud
The NY Times – Amazon EC2
Nasdaq – Amazon S3
Major League Baseball – Joyent
ESPN – Rightscale using Amazon EC2
Hasbro – Amazon EC2
British Telecom – 3Tera
Taylor Woodrow – Google Apps
CSS – Amazon EC2
Activision – Amazon EC2
Business Objects (an SAP company) – Rightscale using Amazon EC2
Amazon AWS's indirect influence
The threat AWS poses to CDNs. While Amazon's AWS service is not a fit for most of those who use a CDN today, it is interesting to see how some CDNs are using Amazon's service to their advantage. Digital Fountain is building its streaming-only, U.S.-based CDN on Amazon Web Services, and other CDNs such as Voxel.net have direct integration with Amazon's S3 API.
AWS generated roughly $100 million in revenue in 2007.
By 2007, AWS bandwidth demand had already surpassed that of Amazon's global websites. As an indicator of adoption, bandwidth utilized by these services in the fourth quarter of 2007 was even greater than bandwidth utilized in the same period by all of Amazon.com's global websites combined.
AWS has drawn heavy developer attention. Over 400,000 developers have registered to use Amazon Web Services (AWS), up more than 30,000 from the previous quarter.
Beyond the four core AWS services, an on-demand video service is being rolled out. Amazon.com introduced a limited beta version of Amazon Video On Demand. The service lets customers rent or buy ad-free movies and television shows and watch them instantly in their web browser on Macs or PCs, and on Sony BRAVIA television sets through the Sony BRAVIA Internet Video Link.
Amazon AWS's potential
AWS has already built a strong reputation and a large user base among individuals and startups, and is beginning to penetrate the enterprise market.
In a recent TechCrunch article about Amazon Web Services, it's revealed that "the biggest customers in both number and amount of computing resources consumed are divisions of banks, pharmaceutical companies and other large corporations who try AWS once, for a temporary project, and then get hooked."
Photo sharing service SmugMug received much attention when it published its business case for using Amazon's S3 storage service. The 2002 startup describes how the service saves it between half a million and a million US dollars per year.
One user's comment, comparing EC2 with a proposed HP cloud, illustrates the scale on offer: "It's too small. I had the same trouble with EC2 until recently. Normal accounts are limited to 18 VCPUs. I have 1,500 now after talking with them, so I'm hoping to use it more. We do testing on internal farms of boxes much bigger than the proposed HP cloud; I think it's about 120 boxes, figuring 8 cores per box minimum."