Special Issue "Server Technologies in Cloud Computing and Big Data"

A special issue of Future Internet (ISSN 1999-5903).

Deadline for manuscript submissions: closed (15 March 2013)

Special Issue Editor

Guest Editor
Prof. Dr. Yike Guo

Department of Computing, Imperial College London, 180 Queen's Gate, London SW7 2BZ, UK
Interests: cloud; big data; informatics; e-science

Special Issue Information

Dear Colleagues,

“Big data is like crude oil — it needs filtering and refining to unlock its value and make it usable.” Holger Kiske (2012)
Big data usually refers to data sets whose size exceeds the ability of commonly used software tools to capture, manage, and process them within a tolerable elapsed time. Such data sets are also characterized by their diversity and by the speed at which they evolve. Big data sizes are a constantly moving target, ranging from a few dozen terabytes to many petabytes in a single data set. Difficulties include capture, storage, search, sharing, analysis, and visualization. Because big data applications demand a spectrum of advanced technologies, skills, and investments, including access to huge amounts of external data and a wide range of data services, the cloud has become a sensible platform for them. The ability of cloud computing to provide computing resources, storage, network capacity, and big data analysis has made the two fields closely connected.
This special issue on Server Technologies in Big Data and Cloud Computing aims to tackle the big data challenge intelligently, efficiently, and effectively using cloud technology. It also emphasizes advances in emerging techniques and solutions for big data and cloud computing.

Yike Guo
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 850 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Big Data applications in Cloud
  • Big Data analytics in Cloud
  • Cloud data management solutions
  • Big Data sharing in Cloud
  • Data as a Service (DaaS)
  • Opportunistic data processing in hybrid clouds
  • Dynamic provisioning for big data processing
  • System Issues related to large datasets
  • Big Data placement, scheduling, and optimization
  • Issues of privacy, security and trust in Big Data analytics
  • Fragmentation issues of Big Data
  • Distributed file systems for Big Data
  • Scalability issues in data storage systems
  • Future research challenges in management of Big Data

Published Papers (1 paper)


Open Access Article: Libraries’ Role in Curating and Exposing Big Data
Future Internet 2013, 5(3), 429-438; https://doi.org/10.3390/fi5030429
Received: 21 March 2013 / Revised: 16 May 2013 / Accepted: 12 July 2013 / Published: 20 August 2013
Cited by 8
This article examines how one data hub is working to become a relevant and useful source in the Web of big data and cloud computing. The focus is on OCLC’s WorldCat database of global library holdings and includes work by other library organizations to expose their data using big data concepts and standards. Explanation is given of how OCLC has begun work on the knowledge graph for this data and its active involvement with Schema.org in working to make this data useful throughout the Web.
(This article belongs to the Special Issue Server Technologies in Cloud Computing and Big Data)
