Avoid duplicate key overheads for same data in storage
Abstract
De-duplication is a data compression technique that reduces the amount of storage needed by service providers by eliminating duplicate copies of repeating data. A pressing challenge today is to perform de-duplication securely in cloud storage. Although convergent encryption has been widely adopted for secure de-duplication, a critical issue in making convergent encryption practical is to efficiently and reliably manage a massive number of convergent keys. We first introduce a baseline approach in which each user holds an independent master key for encrypting the convergent keys and outsourcing them to the server. This paper addresses the problem of privacy-preserving de-duplication in cloud computing and introduces a new de-duplication system supporting Differential Authorization, Authorized Duplicate Check, Unfeasibility of file tokens/duplicate-check tokens, Indistinguishability of file tokens/duplicate-check tokens, and Data Affinity. We present authorized data de-duplication, which protects data security by accounting for the differential privileges of users in the duplicate check, and give several new de-duplication constructions supporting authorized duplicate check. As a proof of concept, we implement a prototype of the proposed authorized duplicate check scheme and conduct experiments using this prototype; the proposed scheme incurs minimal overhead compared to normal operations.
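As a rough illustration of the ideas summarized above, and not the paper's actual construction, the following Python sketch shows convergent encryption (the key is derived from the data itself, so identical files encrypt to identical ciphertexts), the baseline master-key wrapping of convergent keys for outsourcing, and a privilege-bound duplicate-check token. The function names, the AES-CTR choice, and the HMAC token format are assumptions made for this example.

```python
# Sketch only: convergent encryption, baseline convergent-key wrapping, and a
# privilege-bound duplicate-check token. Assumes the "cryptography" package.
import hashlib
import hmac

from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


def convergent_key(data: bytes) -> bytes:
    """Derive the convergent key K = H(data); identical plaintexts give identical keys."""
    return hashlib.sha256(data).digest()


def ce_encrypt(data: bytes) -> tuple:
    """Encrypt data under its convergent key (AES-CTR with a key-derived nonce).

    Returns (ciphertext, tag, key). The tag H(ciphertext) lets the storage
    server detect duplicates without seeing the plaintext; the key stays with
    the user (or is wrapped and outsourced, see wrap_convergent_key below).
    """
    key = convergent_key(data)
    nonce = hashlib.sha256(key).digest()[:16]  # deterministic: equal data -> equal ciphertext
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(data) + encryptor.finalize()
    tag = hashlib.sha256(ciphertext).digest()
    return ciphertext, tag, key


def ce_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Recover the plaintext from the ciphertext and the convergent key."""
    nonce = hashlib.sha256(key).digest()[:16]
    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()


def wrap_convergent_key(master_key: bytes, key: bytes) -> bytes:
    """Baseline key management: mask the convergent key with a pad derived from
    the user's master key so the wrapped key can be outsourced to the server.
    Applying the same function again unwraps it."""
    pad = hashlib.sha256(master_key + b"wrap").digest()
    return bytes(a ^ b for a, b in zip(key, pad))


def duplicate_check_token(tag: bytes, privilege: str, privilege_key: bytes) -> bytes:
    """Differential authorization: the token binds the file tag to a privilege,
    so only users holding that privilege key can produce a matching token."""
    return hmac.new(privilege_key, tag + privilege.encode(), hashlib.sha256).digest()


if __name__ == "__main__":
    # Two users uploading the same file produce the same tag, so the server can
    # detect the duplicate; tokens differ across privileges.
    data = b"same file contents"
    _, tag_a, key_a = ce_encrypt(data)
    _, tag_b, _ = ce_encrypt(data)
    assert tag_a == tag_b
    assert ce_decrypt(ce_encrypt(data)[0], key_a) == data
```

In this sketch the duplicate check only needs the tag and a privilege-specific token, so the server never handles plaintext, which is the intuition behind performing the authorized duplicate check with minimal overhead relative to a normal upload.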
Article Details
How to Cite
R. S. D. S. B. M. S. B. P. R. B. T. P. V. D. B. (2015). Avoid duplicate key overheads for same data in storage. International Journal on Recent and Innovation Trends in Computing and Communication, 3(9), 5519–5523. https://doi.org/10.17762/ijritcc.v3i9.4873
Section
Articles