Future Internet 2016, 8(2), 26; doi:10.3390/fi8020026 - published 11 June 2016
Abstract: Since the turn of the 21st century, we have seen a surge of studies on the state of U.S. education addressing issues such as cost, graduation rates, retention, achievement, engagement, and curricular outcomes. There is an expectation that graduates should be able to enter the workplace equipped to take on complex and “messy” or ill-structured problems as part of their professional and everyday life. In the context of online learning, we have identified two key issues that are elusive (hard to capture and make visible): learning with ill-structured problems and the interaction of social and individual learning. We believe that the intersection between learning and analytics has the potential, in the long term, to minimize the elusiveness of deep learning. This article describes a proposed analytics model meant to capture, and also support the further development of, a learner’s reflective sensemaking.
Future Internet 2016, 8(2), 25; doi:10.3390/fi8020025 - published 3 June 2016
Abstract: The aim of the study was to deepen knowledge of livestock innovations on small-scale farms in developing countries. First, we developed a methodology for identifying potentially appropriate livestock innovations for smallholders and grouping them into innovation areas, defined as sets of well-organized practices with a business purpose. Then, a process management program (PMP) was evaluated according to the livestock innovation level and viability of the small-scale farms. Logistic regression was used to evaluate the impact of the PMP on the economic viability of the farm. Information from 1650 small-scale livestock farms in Mexico was collected, and the innovations were grouped into five innovation areas: A1. Management, A2. Feeding, A3. Genetics, A4. Reproduction, and A5. Animal Health. The resulting innovation level in the system was low (45.7%) and heterogeneous among areas. This study shows the usefulness of the methodology described and confirms that implementing a PMP improves farm viability by an additional 21%, owing to better integration of processes and, as a result, more efficient management.
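The logistic-regression evaluation mentioned in the abstract can be sketched in a few lines. The data below are invented for illustration (they are not the study's 1650-farm dataset), and the single-feature gradient-descent fit is a minimal stand-in for a full statistical analysis.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    # One-feature logistic regression via batch gradient descent:
    # P(viable) = sigmoid(b0 + b1 * pmp)
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Synthetic illustration: farms with a PMP (x=1) are viable ~60% of the
# time, farms without (x=0) ~40% of the time. Numbers are hypothetical.
random.seed(1)
pmp = [i % 2 for i in range(400)]
viable = [1 if random.random() < (0.60 if x else 0.40) else 0 for x in pmp]

b0, b1 = fit_logistic(pmp, viable)
odds_ratio = math.exp(b1)  # multiplicative effect of PMP on the odds of viability
```

A positive fitted coefficient (odds ratio above 1) is what an effect like the reported viability improvement would look like in this setup.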
Future Internet 2016, 8(2), 24; doi:10.3390/fi8020024 - published 1 June 2016
Abstract: Emerging technologies such as Software-Defined Networks (SDN) and Network Function Virtualization (NFV) promise cost reduction and flexibility in network operation while enabling innovative network service delivery models. However, operational network service delivery solutions that actually exploit these technologies, especially at the multi-provider level, still need to be developed. Indeed, the implementation of network functions as software running over a virtualized infrastructure, provisioned on a service basis, lets one envisage an ecosystem of network services that are dynamically and flexibly assembled by orchestrating Virtual Network Functions, even across different provider domains, thereby coping with changing user and service requirements and context conditions. In this paper we propose an approach that adopts Service-Oriented Architecture (SOA) technology-agnostic architectural guidelines in the design of a solution for orchestrating and dynamically chaining Virtual Network Functions. We discuss how SOA, NFV, and SDN may complement each other in realizing dynamic network function chaining through service composition specification, service selection, service delivery, and placement tasks. We then describe the architecture of a SOA-inspired NFV orchestrator, which leverages SDN-based network control capabilities to deliver elastic chains of Virtual Network Functions. Preliminary results of prototype implementation and testing activities are presented, together with the benefits that Network Service Providers derive from adaptive network service provisioning in a multi-provider environment, where computing and networking services are orchestrated to provide end users with an enhanced service experience.
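The service-selection task named in the abstract can be illustrated with a minimal sketch: resolving an ordered chain of abstract VNFs to concrete instances drawn from multiple provider domains. The catalog, provider names, and latency figures below are hypothetical; a real NFV orchestrator would also handle placement, scaling, and SDN-based traffic steering.

```python
# Hypothetical catalog: abstract VNF name -> candidate instances,
# each a (provider_domain, latency_ms) pair, possibly from different domains.
CATALOG = {
    "firewall":      [("domainA", 4.0), ("domainB", 2.5)],
    "nat":           [("domainA", 1.5), ("domainC", 3.0)],
    "load_balancer": [("domainB", 2.0)],
}

def orchestrate(chain):
    """Resolve an ordered chain of abstract VNFs to concrete instances,
    picking for each function the candidate with the lowest latency."""
    selected = []
    for fn in chain:
        provider, latency = min(CATALOG[fn], key=lambda c: c[1])
        selected.append((fn, provider, latency))
    return selected

path = orchestrate(["firewall", "nat", "load_balancer"])
total_latency = sum(lat for _, _, lat in path)
```

Here selection optimizes a single metric; the SOA-style point is that each step of the chain is bound to a concrete service instance at orchestration time rather than fixed in advance.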
Future Internet 2016, 8(2), 23; doi:10.3390/fi8020023 - published 20 May 2016
Abstract: A smart city is an environment where a pervasive, multi-service network is employed to provide citizens with improved living conditions as well as better public safety and security. Advanced communication technologies are essential to achieving this goal. In particular, an efficient and reliable communication network plays a crucial role in providing continuous, ubiquitous, and reliable interconnections among users, smart devices, and applications. As a consequence, wireless networking appears to be the principal enabling communication technology, despite the severe challenges it must face to satisfy the needs arising from a smart environment, such as explosive data volume, heterogeneous data traffic, and support of quality-of-service constraints. An interesting approach to meeting the growing data demand of smart city applications is to adopt suitable methodologies that improve the usage of all potential spectrum resources. Toward this goal, a very promising solution is Cognitive Radio technology, which enables context-aware capability in order to use the available communication resources efficiently according to the surrounding environmental conditions. In this paper we provide a review of the characteristics, challenges, and solutions of a smart city communication architecture based on Cognitive Radio technology, focusing on two new network paradigms, namely Heterogeneous Networks and Machine-to-Machine communications, that are of special interest for efficiently supporting smart city applications and services.
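The core Cognitive Radio mechanism the review builds on, opportunistic spectrum access, can be sketched as energy-detection sensing followed by idle-channel selection. The channel numbers, threshold, and energy readings below are invented for illustration and do not correspond to any specific band plan.

```python
def is_busy(energy_dbm, threshold_dbm=-90.0):
    """Energy detection: a channel is considered occupied by a primary
    user if the measured energy exceeds the sensing threshold."""
    return energy_dbm > threshold_dbm

def select_channel(readings, threshold_dbm=-90.0):
    # Among channels sensed idle, pick the one with the lowest measured
    # energy (least residual interference); None if all are occupied.
    idle = [(e, ch) for ch, e in readings.items()
            if not is_busy(e, threshold_dbm)]
    return min(idle)[1] if idle else None

# Hypothetical per-channel energy readings in dBm for a secondary
# (smart-city) device: channel 40 is occupied, 36 and 44 are idle.
readings = {36: -95.2, 40: -70.1, 44: -93.8}
best = select_channel(readings)
```

A secondary device would repeat this sense-then-transmit cycle, vacating the channel whenever the primary user reappears.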
Future Internet 2016, 8(2), 22; doi:10.3390/fi8020022 - published 18 May 2016
Abstract: The concept of competence, which emerged during the reform of computer engineering degrees, has not helped companies select the most suitable candidates for their jobs. This article presents research conducted to determine why companies have not found these competencies useful and how the two can be aligned. Finally, we show the development of an Expert System that enables companies to select the most suitable candidates for their jobs, considering personal and social skills along with technical knowledge. This prototype will serve as a basis for aligning the competencies defined in the curricula with professional requirements, thus allowing a true alignment between degree courses and the needs of companies.
Future Internet 2016, 8(2), 20; doi:10.3390/fi8020020 - published 17 May 2016
Abstract: For many individuals and organizations, cyber-insurance is the most practical, and often the only, way of handling the major financial impact of an information security event. However, the cyber-insurance market suffers from problems such as information asymmetry, lack of product diversity, illiquidity, and high transaction costs. On the other hand, in theory, capital market-based financial instruments can provide a risk transfer mechanism able to absorb the adverse impact of an information security event. Thus, this article addresses the limitations of the cyber-(re)insurance markets with a set of capital market-based financial instruments. It presents a set of information security derivatives, namely options, vanilla options, swaps, and futures, that can be traded on an information security prediction market. Furthermore, the article demonstrates the usefulness of information security derivatives in a given scenario and evaluates them in comparison with cyber-insurance. In our analysis, we found that information security derivatives can be at least a partial solution to the problems in the cyber-insurance markets. They can serve as effective tools for information elicitation and aggregation, cyber-risk pricing, risk hedging, and strategic decision making in information security risk management.
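As an illustration of how such a derivative could hedge cyber risk, the sketch below computes a cash-or-nothing payoff against a prediction-market probability. The payout amount, market price, and expected-value premium rule are assumptions chosen for the example, not the article's exact instrument design.

```python
def binary_option_payoff(event_occurred, payout=100_000.0):
    """Cash-or-nothing option on a breach event: the buyer (hedging
    breach losses) receives a fixed payout only if the event happens."""
    return payout if event_occurred else 0.0

def fair_premium(market_price, payout=100_000.0):
    # If the prediction-market price equals P(event), the expected-value
    # premium for the option is simply price * payout.
    return market_price * payout

# Hypothetical scenario: the prediction market estimates a 5% chance of
# a breach over the contract period.
premium = fair_premium(market_price=0.05)       # cost of the hedge
hedge   = binary_option_payoff(event_occurred=True)  # compensation if breached
```

In this reading, the prediction-market price plays the role an actuarial loss model plays in cyber-insurance: it aggregates dispersed information about breach likelihood into a single tradable number.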