Future Internet 2014, 6(4), 597-611; doi:10.3390/fi6040597 - published 29 September 2014
Abstract: Geo-Wiki is a crowdsourcing tool used to derive information, based on satellite imagery, to validate and enhance global land cover. Around 5000 users are registered, who contribute to different campaigns to collect data across various domains (e.g., agriculture, biomass, human impact, etc.). However, seeing the Earth’s surface from above does not provide all of the necessary information for understanding what is happening on the ground. Instead, we need to enhance this experience with local knowledge or with additional information, such as geo-located, annotated photographs of surface features. The latest development in enhancing Geo-Wiki in this context has been achieved through collaboration with the University of Waterloo to set up a separate branch, called Geography Geo-Wiki, for use in undergraduate teaching. We provide the pedagogical objectives for this branch and describe two modules that we have introduced in first- and third-year Physical Geography classes. The majority of the feedback was positive and, in many cases, was part of what the students liked best about the course. Future plans include the development of additional assignments for the study of environmental processes using Geo-Wiki that would engage students in a manner very different from that of conventional teaching.
Future Internet 2014, 6(3), 584-596; doi:10.3390/fi6030584 - published 12 September 2014
Abstract: In this paper, we posit that current investigative techniques, particularly as deployed by law enforcement, are becoming unsuitable for most types of crime investigation. The growth in cybercrime and the complexity of its forms, coupled with limitations in time and resources, both computational and human, put an increasing strain on the ability of digital investigators to apply the processes of digital forensics and digital investigations and obtain timely results. To combat these problems, there is a need to make better use of the resources available and to move beyond the capabilities and constraints of the forensic tools in current use. We argue that more intelligent techniques are necessary and should be used proactively. The paper makes the case for such tools and techniques, and investigates and discusses the opportunities afforded by applying the principles and procedures of artificial intelligence to digital forensics intelligence and to intelligent forensics. It suggests that, by applying new techniques to digital investigations, there is an opportunity to address the challenges of the larger and more complex domains in which cybercrimes are taking place.
Future Internet 2014, 6(3), 556-583; doi:10.3390/fi6030556 - published 8 September 2014
Abstract: Based on existing literature, this article makes a case for open (government) data as supporting political efficiency, socio-economic innovation and administrative efficiency, but also finds a lack of measurable impact. It attributes the lack of impact to shortcomings regarding data access (must be efficient) and data usefulness (must be effective). To address these shortcomings, seven key activities that add value to data are identified and are combined into the 7R Data Value Framework, which is an applied methodology for linked data to systematically address both technical and social shortcomings. The 7R Data Value Framework is then applied to the international Fusepool project that develops a set of integrated software components to ease the publishing of open data based on linked data and associated best practices. Real-life applications for the Dutch Parliament and the Libraries of Free University of Berlin are presented, followed by a concluding discussion.
Future Internet 2014, 6(3), 542-555; doi:10.3390/fi6030542 - published 28 August 2014
Abstract: Societies have rapidly morphed into complex entities that are creating accessibility, yet, at the same time, they are developing new forms of neogeographic poverty related to information uptake. Those who have managed to partake in the opportunities provided by the Web have new vistas in which to thrive, in contrast to the new poor, who have limited or no access to information. New forms of data in spatial format are accessible to all; however, few realize the implications of such a transitional change in wellbeing, whether for entire societies or for individuals. Different generations taking up information access can face different levels of accessibility, which may be limited by access to online data, knowledge of how to use the tools and understanding of the results, all within the limits of the spaces they are familiar with. This paper reviews a conceptual process underlining the initial steps of a long-term project in the Maltese Islands that seeks to create an online series of tools that bring the concept of “physical place” to the different generations through the management of a major project: the creation of a 3D virtuality employing scanning processes, GIS, conversion aspects, and a small block-based Minecraft engine.
Future Internet 2014, 6(3), 518-541; doi:10.3390/fi6030518 - published 19 August 2014
Abstract: The World Wide Web is the largest information repository available today. However, this information is very volatile and Web archiving is essential to preserve it for the future. Existing approaches to Web archiving are based on simple definitions of the scope of Web pages to crawl and are limited to basic interactions with Web servers. The aim of the ARCOMEM project is to overcome these limitations and to provide flexible, adaptive and intelligent content acquisition, relying on social media to create topical Web archives. In this article, we focus on ARCOMEM’s crawling architecture. We introduce the overall architecture and we describe its modules, such as the online analysis module, which computes a priority for the Web pages to be crawled, and the Application-Aware Helper which takes into account the type of Web sites and applications to extract structure from crawled content. We also describe a large-scale distributed crawler that has been developed, as well as the modifications we have implemented to adapt Heritrix, an open source crawler, to the needs of the project. Our experimental results from real crawls show that ARCOMEM’s crawling architecture is effective in acquiring focused information about a topic and leveraging the information from social media.
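The priority-driven crawling described in this abstract can be sketched as a minimal frontier that dequeues the highest-priority URL first. This is an illustrative assumption of how such a component might look, not ARCOMEM's actual online analysis module; the `PriorityFrontier` class and the term-overlap `score` function are hypothetical placeholders for the project's real relevance model.

```python
import heapq

class PriorityFrontier:
    """Minimal crawl frontier: URLs come out highest-priority first.

    This mirrors the *role* of a priority-computing analysis module in a
    focused crawler; the data structure is a plain binary heap."""

    def __init__(self):
        self._heap = []       # entries: (-priority, insertion order, url)
        self._seen = set()    # avoid enqueuing the same URL twice
        self._counter = 0     # tie-breaker: FIFO among equal priorities

    def push(self, url, priority):
        if url in self._seen:
            return
        self._seen.add(url)
        # heapq is a min-heap, so negate the priority for max-first order.
        heapq.heappush(self._heap, (-priority, self._counter, url))
        self._counter += 1

    def pop(self):
        _, _, url = heapq.heappop(self._heap)
        return url

def score(url, topic_terms):
    # Hypothetical relevance score: fraction of topic terms found in the URL.
    url_lower = url.lower()
    return sum(term in url_lower for term in topic_terms) / len(topic_terms)

if __name__ == "__main__":
    frontier = PriorityFrontier()
    terms = ["election", "vote"]
    for u in ["http://example.org/sports",
              "http://example.org/election/vote",
              "http://example.org/election"]:
        frontier.push(u, score(u, terms))
    print(frontier.pop())  # the URL matching the most topic terms
```

A real focused crawler would of course score fetched page content (and, as in ARCOMEM, social-media signals) rather than URL strings, but the frontier pattern, score on enqueue, dequeue best-first, is the same.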
Future Internet 2014, 6(3), 498-517; doi:10.3390/fi6030498 - published 19 August 2014
Abstract: Open data initiatives in several countries are characterized by a great expansion in the number of data sets made available for access by public administrations, constituencies, businesses and other actors, such as journalists, international institutions and academics, to mention a few. However, most open data sets rely on selection criteria based on a technology-driven perspective, rather than on the potential public and social value of the data to be published. Several experiences and reports, such as those of the Open Data Census, confirm this issue; however, there are also relevant best practices. The goal of this paper is to investigate the different dimensions of a framework suitable to support public administrations, as well as constituencies, in assessing and benchmarking the social value of open data initiatives. The framework is tested on three initiatives from three different countries: Italy, the United Kingdom and Tunisia. The countries have been selected to provide a focus on European and Mediterranean countries, considering also the differences in their legal frameworks (civil law vs. common law countries).