Future of Everything

Human evolution is inextricably connected to technology’s continued development. Robots (Artificial Intelligence) have become a particularly significant symbol of that relationship. They represent a type of technological ascendancy, but, more importantly, they are an anthropomorphized technological instance that forces us to examine and understand our respective differences and our intersecting and complementary paths.

The development of digital objects and ecosystems that are highly adaptive and resilient is on the rise — American businessman, investor, and author Mark Cuban has called it “the automation of automation.” Many of the individual technologies or sets of technologies described below have the potential to impact higher education’s structures and operational practices, including curricula and co-curricula intended to prepare students for life and work.

“Gartner’s Hype Cycle for Emerging Technologies, 2016” (Walker, Burton, and Cantara, 2016) identifies three distinct high-priority technology trends:

  • Human-centric
  • Perceptual smart machines
  • Platform revolution

Technology will continue to become more human-centric, introducing transparency across our everyday transactions and our interactions with each other, systems, and smart objects (Internet of Things). This relationship will become increasingly interdependent as technologies become more adaptive, contextual, and fluid. Such technologies include brain-computer interfaces [1], human augmentation [2], affective computing [3], augmented reality [4], virtual reality [5], and gesture-control devices [6].

Due to the proliferation of data and advances in computational power and deep neural networks, organizations with smart machine technologies will be able to harness data in order to adapt to new situations and challenges. Such technologies include machine learning [7], virtual personal assistants [8], cognitive expert advisors [9], smart data discovery [10], smart workspace [11], conversational user interfaces [12], smart robots [13], natural language question answering [14], personal analytics [15], and context brokering [16].

Platform revolution is a third high-level technology trend presented in Gartner’s Hype Cycle. Emerging technologies are changing how platforms are defined and used. The shift from technical infrastructure to “ecosystem enabling” is providing a foundation for new business models that bridge humans and technology. Gartner analysts recommend that organizations shift to platform-based business models and exploit algorithms to generate value. Key technologies in this area include blockchain [17], IoT (Internet of Things) platform [18], software-defined security [19], and software-defined anything (SDx) [20].

Data in Education
“Big Data and analytics have the potential to enable [higher education] institutions to thoroughly examine their present challenges, identify ways to address them as well as predict possible future outcomes” (Daniel, 2017, p. 19).

The second high-level trend identified by Gartner analysts, perceptual smart machines, points to greater reliance on created and collected systems data, as well as advances in computation, for data-driven decision-making. In “Big Data in Higher Education: The Big Picture,” Daniel (2017) examines what he calls “a new phenomenon in higher education”: the utilization of Big Data [21]. Daniel points out that higher education has practiced data-driven decision-making in the past, but the practice has generally been unsystematic.

Educational Data Mining (EDM) and learning analytics are interrelated Big Data research areas within higher education. Educational Data Mining is concerned with the development of computational tools for discovering patterns across data, while learning analytics is focused on understanding individual students and how they perform within a given learning environment (Luan, 2002; Romero & Ventura, 2010).
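
The distinction can be made concrete with a brief, hedged sketch. The data, column names (student_id, logins, forum_posts, quiz_avg), and the choice of k-means clustering below are illustrative assumptions, not methods prescribed by the cited authors: the first step discovers patterns across the whole cohort (an EDM-style task), while the second summarizes one student relative to that cohort (a learning-analytics-style task).

```python
# Illustrative sketch only: the dataset, column names, and clustering
# approach are assumptions, not methods from the cited literature.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical activity data exported from a learning management system.
activity = pd.DataFrame({
    "student_id":  [101, 102, 103, 104, 105, 106],
    "logins":      [42,  5,   30,  8,   55,  12],
    "forum_posts": [15,  0,   9,   1,   22,  3],
    "quiz_avg":    [88,  52,  75,  60,  91,  58],
})

# EDM-style task: discover patterns across the data set,
# e.g. cluster students into broad engagement profiles.
features = StandardScaler().fit_transform(
    activity[["logins", "forum_posts", "quiz_avg"]]
)
activity["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(activity.groupby("cluster")[["logins", "forum_posts", "quiz_avg"]].mean())

# Learning-analytics-style task: understand one student within the cohort.
student = activity.loc[activity["student_id"] == 103].squeeze()
cohort_mean = activity[["logins", "forum_posts", "quiz_avg"]].mean()
print(student[["logins", "forum_posts", "quiz_avg"]] - cohort_mean)
```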

Implementation of Big Data solutions (EDM and learning analytics) in higher education will require proper interpretation and understanding of administrative and operational data. When used in conjunction with other established methods, data has the potential to greatly assist in assessment of performance and progress across institutional divisions and departments (Picciano, 2012; Siemens and Long, 2011; Watters, 2011).

Higher education’s implementation of Big Data and analytics solutions can be linked to a greater need for decision-making to be based on evidence from data rather than solely on intuition or experience. Other factors include increased accountability, necessitating the collection of different forms of data for the purpose of generating reports for internal and external stakeholders (Daniel, 2017).

The various information and communication technologies used by students, faculty, and academic staff create data that could provide useful information. This could include unstructured forms of data associated with social media technologies (images, tweets, videos, audio, and web pages) (Sagiroglu and Sinanc, 2013).

The full implementation of Educational Data Mining (EDM) and learning analytics in higher education is not without its complexities. One is system-to-system interoperability: data is more often than not stored in different formats and modeled in different ways, and aggregating these various forms of data to extract meaning can be a challenge (Daniel & Butson, 2013).
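
A minimal sketch of that aggregation problem follows. The file layouts, field names, and identifier mapping are hypothetical; the point is only that records describing the same students often arrive from different systems, in different formats, under different schemas, and must be reconciled before analysis.

```python
# Illustrative sketch: the layouts, field names, and ID mapping are assumptions.
import io
import json
import pandas as pd

# A gradebook export, CSV-formatted, keyed by "student_id".
gradebook_csv = io.StringIO(
    "student_id,course,final_grade\n"
    "101,BIO101,88\n"
    "102,BIO101,52\n"
)

# Clickstream events from a different system, JSON-formatted, keyed by "userId".
events_json = json.loads(
    '[{"userId": 101, "action": "video_play", "minutes": 34},'
    ' {"userId": 101, "action": "quiz_attempt", "minutes": 12},'
    ' {"userId": 102, "action": "video_play", "minutes": 3}]'
)

grades = pd.read_csv(gradebook_csv)
events = pd.DataFrame(events_json)

# Reconcile the two schemas: different key names, different granularity.
usage = (
    events.rename(columns={"userId": "student_id"})
          .groupby("student_id", as_index=False)["minutes"].sum()
)
combined = grades.merge(usage, on="student_id", how="left")
print(combined)
```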

Discussions on the utilization of Big Data in education often focus on greater transparency and efficiency. Eynon (2013) advises, “…discussions of using data to enhance efficiency, increase transparency, support, [competitiveness], and as a tool to evaluate performance (of schools and teachers) [need to be] tempered with considered academic debate” […], and “focus on using these […] tools to empower, support, and facilitate practice and critical research.”

On the learning analytics side, the collection of educational data raises ethical issues associated with ownership, privacy, security, and use (Jones, 2012; Prinsloo et al., 2015). There are also matters of accountability associated with the use of student data for predictive modeling (Daniel, 2017). Eynon (2013) points out that student prediction models carry important social implications: the more we know about students and their challenges, the greater care we must take to avoid foregone conclusions about failure or success.
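
To make that concern concrete, the sketch below shows what a minimal student prediction model might look like. The features, labels, and choice of logistic regression are assumptions made purely for illustration; the output is a probability that should prompt outreach and support, not a verdict on the student.

```python
# Illustrative sketch only: features, labels, and model choice are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [logins_per_week, assignments_submitted]
X_train = np.array([[1, 2], [2, 3], [8, 9], [9, 10], [3, 4], [10, 10]])
# 1 = did not complete the course, 0 = completed
y_train = np.array([1, 1, 0, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

# Score a current student; report a probability, not a verdict.
current_student = np.array([[4, 5]])
risk = model.predict_proba(current_student)[0, 1]
print(f"Estimated non-completion risk: {risk:.2f}")
# A figure like this should trigger support and further inquiry,
# never a foregone conclusion about failure or success.
```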

Based on the reviewed research, there is general agreement with Daniel (2017): “reliance on data-driven decision-making will become a central approach in many research- and teaching-intensive institutions in higher education. Big Data and analytics are more likely to effectively transform the way higher education operates and govern[s] itself through the use of various technologies [that] capture, process, analyze, present and use data to generate actionable insights to drive their decisions” (p. 25).

References

Daniel, Ben K. “Big data in higher education: The big picture.” In Big Data and Learning Analytics in Higher Education, pp. 19-28. Springer International Publishing, 2017.

Daniel, Ben K., and Russell Butson. “Technology Enhanced Analytics (TEA) in Higher Education.” In Proceedings of the International Conference on Educational Technologies, pp. 89-96. 2013.

Eynon, Rebecca. “The rise of Big Data: What does it mean for education, technology, and media research?” Learning, Media and Technology 38, no. 3 (2013): 237-240.

Jones, Stephanie J. “Technology review: the possibilities of learning analytics to improve learner-centered decision-making.” The Community College Enterprise 18, no. 1 (2012): 89.

Laney, Douglas. “3D data management: Controlling data volume, velocity and variety.” Gartner Report (2001). Available at https://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf

Luan, Jing. “Data mining and its applications in higher education.” In Andreea M. Serban and Jing Luan (Eds.), Knowledge Management: Building a Competitive Advantage in Higher Education, pp. 17-36. San Francisco, CA: Jossey-Bass, 2002.

Picciano, Anthony G. “The evolution of Big Data and learning analytics in American higher education.” Journal of Asynchronous Learning Networks 16, no. 3 (2012): 9-20.

Prinsloo, Paul, Elizabeth Archer, Glen Barnes, Yuraisha Chetty, and Dion Van Zyl. “Big(ger) data as better data in open distance learning.” The International Review of Research in Open and Distributed Learning 16, no. 1 (2015).

Romero, C. R., & Ventura, S. (2010). “Educational data mining: A review of the state of the art.” IEEE Transactions on System, Man, and Cybernetics Part C: Applications and Reviews, 40(6), 601-618.

Sagiroglu, Seref, and Duygu Sinanc. “Big Data: A review.” In 2013 International Conference on Collaboration Technologies and Systems (CTS), pp. 42-47. IEEE, 2013.

Siemens, George, and Phil Long. “Penetrating the fog: Analytics in learning and education.” EDUCAUSE review 46, no. 5 (2011): 30.

Walker, Mike J., Betsy Burton, and Michele Cantara. “Hype cycle for emerging technologies, 2016.” Gartner, July 2016.

Watters, Audrey. “How data and analytics can improve education: George Siemens on the applications and challenges of education data.” O’Reilly Radar (2011). Available at https://www.oreilly.com/ideas/education-data-analytics-learning

Notes

(Note definitions 1–20 are taken from “Gartner’s Hype Cycle for Emerging Technologies, 2016.”)

[1] A brain-computer interface is a type of user interface, whereby a computer interprets the user’s distinct brain patterns.

[2] Human augmentation creates cognitive and physical improvements as an integral part of the human body to deliver performance that exceeds normal human limits.

[3] Affective computing technologies sense the emotional state of a user and respond by performing specific, predefined product and service features.

[4] Augmented reality (AR) is the real-time use of information in the form of text, graphics, audio, and other virtual enhancements integrated with real-world objects and presented using a head-mounted-type display or projected graphics overlays. It is this “real world” element that differentiates AR from virtual reality.

[5] Virtual reality (VR) provides a computer-generated 3D environment that surrounds a user and responds to an individual’s actions in a natural way, usually through immersive head-mounted displays (HMDs).

[6] Gesture control devices are worn or held by the user in order to capture body movements, gestures and expressions. Devices and software applications can interpret gestures with specific semantic content as a means to enhance the human-machine interface (HMI).

[7] Machine learning is a technical discipline that aims to extract certain kinds of knowledge/patterns from a series of observations.

[8] A virtual personal assistant (VPA) performs some of the functions of a human assistant. With a user’s permission, it observes user content and behavior, may predict the user’s needs, and may act autonomously on the user’s behalf.

[9] Cognitive expert advisors (CEAs) possess a specialized algorithm, as well as machine-learning and natural-language processing functions tuned specifically to a purpose-built, curated body of big data to generate insights, discoveries, recommendations and decisions.

[10] Smart data discovery is a next-generation data discovery capability that enables business users and citizen data scientists to automatically find, visualize and narrate relevant findings, such as correlations, exceptions, clusters, links and predictions, without having to build models or write algorithms.

[11] A smart workspace leverages the growing digitalization of physical objects brought about by the Internet of Things (IoT) to deliver new ways of working, sharing information and collaborating.

[12] Conversational UI (CUI) is a high-level design model in which user and machine interactions primarily occur in the user’s spoken or written natural language.

[13] Smart robots are smart machines with an electromechanical form factor that work autonomously in the physical world, learning in short-term intervals from human-supervised training and demonstrations or by their supervised experiences on the job. They sense conditions in local environments and recognize and solve problems.

[14] Natural-language question answering (NLQA) technology is a type of natural-language processing technology, composed of applications that provide users with a means of asking a question in plain language.

[15] Personal analytics is the use of data by an individual to help achieve objectives across a range of domains.

[16] Context brokering is a data-processing model aimed at discovery and analysis of the context of data to understand the derived states of entities (such as people, things or places) relevant to human or automated decision-making.

[17] Blockchain is a type of distributed ledger in which value exchange transactions (in bitcoin or other token) are sequentially grouped into blocks. Each block is chained to the previous block and immutably recorded across a peer-to-peer network, using cryptographic trust and assurance mechanisms.

[18] An Internet of Things (IoT) platform is software that facilitates operations involving IoT endpoints (sensors, devices, multidevice systems and systems of systems) and enterprise resources.

[19] Software-defined security (SDSec) is an umbrella term covering a number of security technologies that benefit when the security policy management is abstracted from the underlying security policy enforcement points.

[20] Software-defined anything (SDx) is a collective term that encapsulates the market momentum for improved standards for infrastructure programmability and data center interoperability that are driven by automation inherent to cloud computing, DevOps and fast bimodal infrastructure provisioning.

[21] Big Data is regarded as a term that describes the incredible growth in the volume, structure, and speed at which data is being generated (Daniel, 2017, p. 21). Laney (2001) summarizes what constitutes Big Data as the “three Vs” (volume, velocity, and variety).