Title: File Deduplication with Cloud Storage File System
Authors: Ku, Chan-I
Luo, Guo-Heng
Chang, Che-Pin
Yuan, Shyan-Ming
Department of Computer Science
Keywords: HDFS; Data Deduplication; Cloud Computing; Single Instance Storage
Issue Date: 1-Jan-2013
Abstract: The Hadoop Distributed File System (HDFS) is used to solve the storage problem of huge data sets, but it does not provide a mechanism for handling duplicate files. In this study, a middle-layer file system in an HBase virtual architecture is used to deduplicate files in HDFS. Two architectures are proposed according to the reliability requirements of the application: RFD-HDFS (Reliable File Deduplicated HDFS), which does not permit any errors, and FD-HDFS (File Deduplicated HDFS), which can tolerate very few errors. Beyond the savings in storage space, the marginal benefits of deduplication are explored. Suppose a popular video is uploaded to HDFS by one million users: under Hadoop's three-way replication, these uploads are stored as three million files, which wastes a great deal of disk space, and only cloud-side deduplication can store them efficiently. With deduplication, only three file spaces are occupied, achieving 100% removal of duplicate files. The experimental architecture is a cloud-based documentation system, similar to a cloud version of EndNote, which simulates the clustering effect of a massive database when researchers synchronize their data with cloud storage.
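The abstract's million-user example reduces to content-addressed, single-instance storage: hash the file, store the bytes once, and let every later upload record a reference to the existing copy. The sketch below illustrates only that idea; the class name, methods, and in-memory maps are hypothetical stand-ins for the paper's HBase index and HDFS blob store, not its implementation.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of single-instance storage keyed by a content hash.
public class DedupStoreSketch {
    // hash -> file content (stands in for the HDFS blob store)
    private final Map<String, byte[]> blobs = new HashMap<>();
    // logical path -> hash (stands in for the HBase index table)
    private final Map<String, String> index = new HashMap<>();

    public void put(String path, byte[] content) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(content);
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        String hash = hex.toString();
        blobs.putIfAbsent(hash, content); // physical copy stored at most once
        index.put(path, hash);            // each upload only records a reference
    }

    public byte[] get(String path) {
        return blobs.get(index.get(path));
    }

    public static void main(String[] args) throws Exception {
        DedupStoreSketch store = new DedupStoreSketch();
        byte[] video = "the same popular video".getBytes(StandardCharsets.UTF_8);
        for (int user = 0; user < 1_000_000; user++) {
            store.put("/user" + user + "/video.mp4", video); // a million logical uploads
        }
        // Only one physical instance is kept; HDFS replication would then
        // produce three replicas instead of three million stored files.
        System.out.println("physical instances: " + store.blobs.size()); // prints 1
    }
}

One plausible reading of the two proposed architectures in these terms: on a hash hit, RFD-HDFS would additionally verify the content byte-for-byte so no collision error is possible, while FD-HDFS would trust the hash alone and accept a vanishingly small collision risk in exchange for less I/O.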
URI: http://dx.doi.org/10.1109/CSE.2013.52
http://hdl.handle.net/11536/125062
ISSN: 1949-0828
DOI: 10.1109/CSE.2013.52
Journal: 2013 IEEE 16th International Conference on Computational Science and Engineering (CSE 2013)
Start Page: 280
End Page: 287
Appears in Collections: Conference Papers