Wednesday, 26 April 2017

#Mobile #assessment based on self-determination theory of motivation #educon17

Talk given at Educon in Athens, Greece by Stavros Nikou: a really interesting mobile learning contribution in the area of vocational learning and assessment. Mobile devices in assessment offer and support new learning pedagogies and new ways of assessing: collaborative and personalised assessments.

The framework is motivated by self-determination theory (http://selfdeterminationtheory.org/theory/), which distinguishes intrinsic and extrinsic motivation. Intrinsic motivation comes from within the person, because the activity is enjoyable; extrinsic motivation is built upon reward or punishment. What they try to do is ignite more intrinsic motivation, as it leads to better understanding and better performance.

There are three elements in the theory: autonomy, competence and relatedness, all of which impact self-determination. This study tries to use these three elements to increase intrinsic motivation.
Mobile-based assessment motivational framework: the framework is still in a preliminary phase, but of interest. Autonomy: personalised and adaptive guidance, grouping questions into different difficulty levels (adaptive to the learner), location-specific and context-aware.
Competence: provide immediate emotional and cognitive feedback, drive students to engage in authentic learning activities, and give appropriate guidance to support learners.
Preliminary evaluation of the proposed framework: paper-based and mobile-based assessments were used before and after the intervention to test the framework. Using an experimental design, assessments were taken after each week of formal training, two assessments in total for both groups. ANCOVA was used for the data analysis.
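The post does not give the analysis details, so purely as an illustration of how an ANCOVA-style comparison works (all scores, group sizes and effect sizes below are invented), the intervention effect can be estimated as a linear model that adjusts the post-test score for the pre-test covariate:

```python
import numpy as np

# Illustrative only: synthetic pre/post scores for two groups (paper-based
# vs mobile-based); the data, effect size, and noise level are all invented.
rng = np.random.default_rng(0)
n = 40
pre = rng.normal(60, 10, n)              # pre-test score (the covariate)
group = np.repeat([0, 1], n // 2)        # 0 = paper-based, 1 = mobile-based
post = 0.8 * pre + 5 * group + rng.normal(0, 5, n)

# ANCOVA expressed as a linear model: post ~ intercept + pre + group
X = np.column_stack([np.ones(n), pre, group])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)

# beta[2] estimates the group difference adjusted for pre-test performance
print(f"adjusted group effect: {beta[2]:.2f}")
```

Adjusting for the pre-test is what lets a between-group comparison control for prior ability differences.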

Results: significant differences in autonomy, competence and relatedness. The framework will be expanded with additional mobile learning features and used with different students; future research aims to enhance the framework further.
The mobile assessment had a social media collaborative element in it, and it also made use of more feedback options thanks to the technical possibilities of the mLearning option.


Using #learningAnalytics to inform research and practice #educon17

Talk during Educon2017 by Dragan Gasevic, known for his team's award-winning work on the LOCO-Analytics software, considered one of the pioneering contributions in the growing area of learning analytics. In 2014 he founded ProSolo Technologies Inc (https://www.youtube.com/watch?v=4ACNKw7A_04), which develops a software solution for tracking, evaluating, and recognizing competences gained through self-directed learning and social interactions.

He jumped onto the stage with a bouncy step and was in good form to get his talk going.

What he understands by learning analytics is the following: shaping the context of learning analytics results in challenges and opportunities. Developing a lifelong learning journey automatically calls for a measuring system that can support and guide the learning experience for individuals.
Active learning also means constant funding, to enable the constant iteration of knowledge, research and technology. But even if you provide new information, there are only limited means to understand who in the room is actually learning something, and who is not. Addressing the need for meaningful feedback on what is learned is the basis of learning analytics.
Beyond the learning system (e.g. an LMS), socio-economic details of individuals are also used.
No matter which technologies are used, the interaction with these technologies results in digital footprints. Initially technologists used the digital footprints as a means to adjust the technology, but gradually natural language processing, learning, meaning creation… also became investigated through these footprints.

Actual applications of learning analytics are given: two well-known examples.
Course Signals from Purdue University: analysing student actions within their LMS (Blackboard), different student variables and outcome variables for student risk (high, medium, low) were fed to algorithms using the data from the digital footprints of each student. The teachers and students got ‘traffic light’ alerts. Students who used the signals showed an increase of 10 to 20 percent in student success.
A content analysis of Course Signals use showed that summative feedback seemed much less related to student success, but formative (detailed, specific) feedback did have an immediate effect on learning success.
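The actual Course Signals algorithm and its variables are not described in the talk, so the sketch below only illustrates the general idea of mapping digital-footprint variables to a traffic-light label; the features, weights and thresholds are all invented:

```python
# Minimal sketch of a Course-Signals-style traffic light. The real Purdue
# algorithm is not public in this post: the inputs (scaled 0..1 except the
# raw login count), the weights, and the cut-offs are purely illustrative.
def risk_light(lms_logins, assignments_submitted, current_grade):
    """Map simple LMS footprint variables to a traffic-light risk label."""
    score = (0.4 * min(lms_logins / 20, 1)
             + 0.3 * assignments_submitted
             + 0.3 * current_grade)
    if score < 0.4:
        return "red"      # high risk
    if score < 0.7:
        return "yellow"   # medium risk
    return "green"        # low risk

print(risk_light(lms_logins=5, assignments_submitted=0.3, current_grade=0.5))  # red
```

The value of such a scheme in the Purdue case was less the score itself than the immediate, visible alert it produced for both teacher and student.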
The University of Michigan's E2Coach (Michigan being among the top public universities in the US): they have large science classrooms, populated by students with very varied science-grade backgrounds.
In the E2Coach project they used the idea of ‘better than expected’, so they looked at successful learning patterns: successful students would be adaptive (trying different options to learn) and would self-organise in peer groups to enable content structuring.
Top-performing students were asked to give pointers on what they did to be successful learners. Those pointers were given to new students as feedback on how they could increase learning success, while at the same time giving them the option to learn (self-determination theory). This resulted in about 5 percent improvement in learner success.

Challenges of learning analytics
Four challenges:
Generalisability: we are seeing predictive models for student success, but they only extend to what can be generalised. The significance of these models did not hold across different contexts, so generalisability was quite low. Some indicators seem to be significant predictors in one context, yet in others they are not; so what is the reason behind this? This means we are now collecting massive amounts of MOOC data to look for specific reasons. But this work is difficult, as we need to understand which questions we need to address.
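A minimal sketch of why such models fail to generalise: a predictor fit in one course can break down in another where the same indicator no longer relates to success. All data below is synthetic, and the "forum activity" indicator is just an assumed example:

```python
import numpy as np

# Illustrative only: a success predictor fit on one course misleads in another.
rng = np.random.default_rng(1)

def course(n, forum_effect):
    """Synthetic course: standardised forum activity and resulting grade."""
    forum = rng.normal(0, 1, n)
    grade = forum_effect * forum + rng.normal(0, 1, n)
    return forum, grade

f_a, g_a = course(200, forum_effect=0.8)   # forum posts predict grades here...
f_b, g_b = course(200, forum_effect=0.0)   # ...but not in this course

# Fit grade ~ forum on course A with least squares
X = np.column_stack([np.ones_like(f_a), f_a])
beta, *_ = np.linalg.lstsq(X, g_a, rcond=None)

def r2(f, g):
    """Variance explained by the course-A model on a given course."""
    pred = beta[0] + beta[1] * f
    return 1 - np.sum((g - pred) ** 2) / np.sum((g - g.mean()) ** 2)

print(f"R^2 on course A: {r2(f_a, g_a):.2f}")   # decent fit
print(f"R^2 on course B: {r2(f_b, g_b):.2f}")   # fails to transfer
```

The indicator is not wrong in course A; it is simply context-bound, which is exactly the generalisability problem described above.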
Student agency is also a challenge. How many of the decisions are made by the students themselves? The responsibility for learning is ultimately in their hands.
A common myth in learning analytics: the more time students spend on tasks, the more they will learn. Actually, this is not the case; often the reverse. Even time spent with educators frequently shows up as an indicator of poor learner success.
Feedback presentation: we felt that the only way to give feedback was through visualisations and dashboards, and many vendors involved in learning analytics focus on dashboards. But sometimes these dashboards are harmful: students compare themselves with the class performance, resulting in less student engagement and learning. Students sometimes invested less time because the dashboards suggested they were doing well, and with less investment came less learning.
Investment and willingness to understand: http://he-analytics.com and the SHEILA project (http://sheilaproject.eu/) investigated more than 50 senior leaders on their understanding of learning analytics. Institutions hardly provide opportunities to learn what learning analytics are really about. There is a lack of leadership on learning analytics, so in many cases institutions are not sure what it entails or what to do with it. That results in simply buying a product… which does not make sense.
Lack of active engagement of all the stakeholders: students are mostly not involved from day one in the development of these learning analytics (no user-centred approach).

Direction for learning analytics
Learning analytics are about learning. So we need to fall back on what we already know about learning, and then design certain types of interventions using learning analytics. Learning analytics is more than data science: data science provides powerful algorithms, machine learning, system dynamics… but we end up with a data-crunching problem, because we need theory (particular approaches: cognitive load, self-regulation), and practice also informs where to go. We need to consider whether the results make sense: which of the correlations are really meaningful? At the same time we need to take into account learning design and the way we construct learning paths for our students. We cannot ignore experimental design if we want meaningful learning analytics; we need to be very specific about study design. Interaction design covers the types of interfaces, but these need to be aligned with pedagogical methods.

How does this address the challenges mentioned before?
Generalisability: we need to take into account that one size fits all will never work in learning. Different missions, different populations, different models, different legislation… down to the level of individual courses. Differences in instructional design mean that different courses need different approaches; it is all about contextual information. So what shapes our engagement? Social networks benefit learning mainly through so-called weak ties; networks with only strong ties restrain full learning success. Data mining can help us analyse these networks, e.g. exponential random graph models (not sure here?). Machine learning transfer: using models across different domains; there are recent good developments addressing this.
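To make the weak-tie idea concrete, here is a toy sketch in the spirit of Granovetter's strong/weak distinction. The learner names, interaction data and cut-off are invented; real analyses would use richer network models such as the exponential random graphs mentioned above:

```python
from collections import Counter

# Toy data: each pair is one observed interaction (forum reply, shared
# comment, ...) between two learners; names and counts are invented.
interactions = [
    ("ana", "ben"), ("ana", "ben"), ("ana", "ben"),   # repeated contact
    ("ana", "chris"), ("ben", "dina"), ("chris", "dina"),
]

# Tie strength = number of interactions between a pair (order-independent)
strength = Counter(frozenset(pair) for pair in interactions)

# Granovetter-style cut-off: infrequent contact = weak tie
weak = sorted(tuple(sorted(t)) for t, n in strength.items() if n < 3)
strong = sorted(tuple(sorted(t)) for t, n in strength.items() if n >= 3)
print("strong ties:", strong)
print("weak ties:", weak)
```

Weak ties matter here because they bridge otherwise disconnected groups, which is where new information tends to flow into a learner's network.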

Student agency: back to established knowledge (a 2006 paper: students use operations to create artefacts for recall, or to provide arguments or critical thinking). Student decisions are based on student conditions: prior knowledge, study skills, motivation… all of these conditions need to be taken into account. Algorithms can identify sub-groups of learners: some students are really active but not productive, while others are only moderately active yet perform very well in terms of studying. Study skills change and priorities change during learning, which means different learning agency. Desirable difficulties need to be addressed and investigated. There was no significant difference in success between the highly active and the moderately active students, which needs to be studied to find the reasons behind it. Learners' motivation changes most during the day, as can be seen in the literature. So we need to focus on understanding these reasons, and set up interdisciplinary teams to highlight possible reasons while strongly grounding the work in existing theory.
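As a toy illustration of such sub-groups (the metrics, thresholds and students are invented, and real studies would use clustering algorithms rather than fixed cut-offs), activity and performance can be profiled separately:

```python
# Invented learner metrics, both scaled 0..1, to illustrate the mismatch
# between activity and productivity described in the talk.
learners = {
    "s1": {"activity": 0.9, "performance": 0.45},  # very active, not productive
    "s2": {"activity": 0.5, "performance": 0.85},  # moderate activity, strong results
    "s3": {"activity": 0.9, "performance": 0.90},  # active and productive
}

def profile(metrics, cut=0.7):
    """Label a learner on each dimension using an (assumed) fixed cut-off."""
    act = "high" if metrics["activity"] >= cut else "moderate"
    perf = "high" if metrics["performance"] >= cut else "moderate"
    return f"{act} activity / {perf} performance"

for name, metrics in learners.items():
    print(name, "->", profile(metrics))
```

Keeping the two dimensions separate is the point: collapsing them into one "engagement" score would hide exactly the active-but-unproductive group the talk highlights.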

Analytics-based feedback: students need guidance, not only task-specific language indicators. This can be done through semi-automatic teacher triggers that provide more support and guidance, resulting in meaningful feedback used by students. (Look up research from Sydney, ask reference Inge.) Personalised feedback has a significant effect (Inge, again seen in MobiMOOC). http://ontasklearning.org
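In the spirit of OnTask-style rule-based personalised feedback (the real OnTask rule syntax and workflow differ; the conditions, thresholds and messages below are an invented sketch), semi-automatic teacher triggers could look like this:

```python
# Teacher-defined (condition, message) rules over simple activity indicators;
# every indicator name and threshold here is an illustrative assumption.
rules = [
    (lambda s: s["videos_watched"] < 3,
     "You have watched few of this week's videos -- start with the intro clip."),
    (lambda s: s["quiz_score"] < 0.5,
     "Your quiz score suggests revisiting section 2 before the next task."),
    (lambda s: s["quiz_score"] >= 0.8,
     "Strong quiz result -- try the optional challenge exercise."),
]

def feedback(student):
    """Collect every personalised message whose condition matches."""
    return [msg for cond, msg in rules if cond(student)]

print(feedback({"videos_watched": 1, "quiz_score": 0.4}))
```

The teacher stays in the loop by writing and approving the rules, which is what makes this "semi-automatic" rather than a black-box recommender.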
Shall we drop the study of visualisation? No, it is significant for study skills and for decision making on analytics, but we need to focus on which methods work and gradually introduce visualisations to learn what works and what does not. Tailor it to specific tasks, and design it differently than up till now.

Development of analytics capacity and culture: ethics and privacy concerns; very few faculty really ask students for feedback. What are the key points for developing culture? Think about discourse (not only technical specs). We need to understand data, work with IT, and use different types of models, not only data crunching, starting from what we already know. Finally, transformation: we need to step away from learning analytics as technology and talk to our stakeholders to know how to act, how to do it, and who is responsible for what; the process can be an inclusive adoption process (look it up). We need to think about questions and design strategies, and when working on the whole phenomenon we need to involve the students. Students are highly aware of the usefulness of their data.

If we want to be successful with learning analytics, we need to work together to make a significant difference and impact.

Tuesday, 25 April 2017

Liveblog from university to continued professional learning #educon17 #lifelonglearning

Frank Gielen talks on innovation adoption and transformation. This talk is part of the pre-conference talks of the Educon conference in Athens, Greece. The talk looks at how to organise learning (master's, professional, PhD…) to gradually move towards lifelong learning.

There is a skills gap between what companies want and which human resources and innovations are available.
The human capital is frequently missing, which means that education is increasingly important.
If you want to transform an ‘old’ energy approach into sustainable or renewable energy approach(es), you need that human capital.
So education is core in the innovation process, as you need to train all stakeholders (senior management, workforce on the floor, mid-management…).
Education linked to innovation has two main factors impacting it: speed of adoption (graduates need to be skilled) and timeliness (skills need to be used within two months).
So innovation speed is equivalent to training need: learning needs to be adapted to the speed of innovation.

Personalising learning
Starting from the knowledge triangle: education, industry and research as a baseline for higher education goals, which need to be combined in order to create an employable, highly trained workforce coming out of higher ed.
Which learning trends are important to stay competitive in the market? Continued professional development, and becoming power learners. This means that the human factor needs to develop continually in order to stay on top of a fast-moving field.
There is no one size fits all, so in education this means personalised learning. The role of the teacher changes: instead of giving a lot of lectures, there are online resources that students are knowledgeable enough to use to create a constant baseline, plus mentoring, e.g. the Socratic approach where teachers are in close contact with and support learners.
Solving a challenge also includes having an effect on society.

Merging masters with professional learning
Contemporary learning consists on average of 70 percent informal learning, 20 percent social learning and 10 percent formal training.
MicroMasters (a short online format, 10-15 ECTS and commonly project-based) are in many cases complementary to the campus teaching, but enable a blended master. This is something we need to consider as InnoEnergy. In many cases microMasters are linked to deepening learning in a specific field and are frequently based on a general foundation (so there is a need for clear learning paths).
We are shifting towards lifelong learning, blurring the boundaries between master schools, doctoral schools and professional schools.
Learning architecture: MOOCs or microMaster, certified microMaster, blended microMaster with coaching, blended-in-house microMaster with coaching and Bring Your Own Program (BYOP).
A new learning paradigm: personalised, just-in-time learning.
Education is going through a digital transformation. This means that more data is available, which we can start using as a means to support learning. Based on this, personalised learning will become available and missing skill sets can be identified. Data-driven education comes a bit closer to enabling personalised learning.
Feedback and coaching have the highest learning impact. This means that teachers need to be prepared to become a guide-on-the-side or a good coach.

Learning entrepreneurship
Learners need to learn entrepreneurship too. Not all students need to become entrepreneurs, but all of them need to understand the entrepreneurial skill set: seeing opportunities, motivating people, driving change, finding scarce resources, and dealing with the uncertainty of innovation. But then, how do we measure and assess this?

This means being an early adopter, and being a catalyst for educational innovation.

Friday, 21 April 2017

Novel initiative Teach Out: Fake news detecting #criticalthinking #mooc

If you have just a bit of time this week, and you are interested in new ways of online teaching as well as critical thinking... this is a fabulous initiative. The “Fake news, facts and alternative facts” MOOC is part of a Teach-Out (a brief yet meaningful just-in-time learning initiative focused on a hot topic).

Course starts on 21 April 2017 (today)
Course given by the University of Michigan, USA

This is not just a MOOC; actually, it being a MOOC is the boring part. What is really interesting is the philosophy behind the Teach-Out, and the history behind Teach-Out events. This feels a bit more like activist-driven teaching, admittedly here with a renowned institute.



Brief course description
Learn how to distinguish between credible news sources and identify information biases to become a critical consumer of information.
How can you distinguish between credible information and “fake news?” Reliable information is at the heart of what makes an effective democracy, yet many people find it harder to differentiate trustworthy journalism from propaganda. Increasingly, inaccurate information is shared on social networks and amplified by a growing number of explicitly partisan news outlets. This Teach-Out will examine the processes that generate both accurate and inaccurate news stories and the factors that lead people to believe those stories. 

Participants will gain skills that help them to distinguish fact from fiction.

This course is part of a Teach-Out, which is:
·        an event – it takes place over a fixed, short period of time
·        an opportunity – it is open for free participation to everyone around the world
·        a community – it will be joined by a large number of diverse individuals
·        a conversation – an opportunity to give and take ideas and information from people

The University of Michigan Teach-Out Series provides just-in-time community learning events for participants around the world to come together in conversation with the U-M campus community, including faculty experts. The U-M Teach-Out Series is part of our deep commitment to engage the public in exploring and understanding the problems, events, and phenomena most important to society.

Teach-Outs are short learning experiences, each focused on a specific current issue. Attendees will come together over a few days not only to learn about a subject or event but also to gain skills. Teach-Outs are open to the world and are designed to bring together individuals with wide-ranging perspectives in respectful and deep conversation. These events are an opportunity for diverse learners and a multitude of experts to come together to ask questions of one another and explore new solutions to the pressing concerns of our global community. Come, join the conversation!

(Picture: http://maui.hawaii.edu/hooulu/2017/01/07/the-real-consequences-of-fake-news/ )

Companies should attract more Instructional Designers for training #InstructionalDesign #elearning

Online learning is increasingly pushing university learning and professional training into new directions. This means common ground must be set on what online learning is, which approaches are considered as best practices and which factors need to be taken into account to ensure a positive company wide uptake of the training. Although online learning has been around for decades, building steadily on previous evidence-based best practices, it is still quite a challenge to organize online learning across multiple partners, let alone across cultures (in the wide variety of definitions that culture can have).

Earlier this month Lionbridge came out with a white paper entitled “Steps for globalizing your eLearning program”. It is a 22-page free eBook, and a way to get your contact data. The report is more corporate than academically inclined (the subtitle is ‘save time, money and get better results’), and offers an inside look at how companies see global elearning and which steps to take first. But when reading the report - which does provide useful points - I do feel that corporate learning needs to accept that instructional design expertise is necessary (the experts! the people!) and needs to be attracted by the company, just like top salespeople, marketing, HR… for it is a real profession, and it demands more than the capacity to record a movie and put it on YouTube!

In their first step they mention: creating a globalizing plan
  • Create business criteria
  • Decide on content types
  • Get cultural input
  • Choose an adaptation approach

The report sets global-ready content as a baseline: this section mentions content that is culturally neutral. Personally, I do not believe cultural neutrality is possible; therefore I would suggest using a balanced cultural mix, e.g. mixing cultural depictions or languages, even Englishes (admitting there is more than one type of English and they are all good). On the bonus side, the report also stresses the importance of using culturally native instructional design (yes!), which I think can be learner-driven content to allow local context to come into the global learning approach. Admittedly, this might result in more time or more cost (depending on who provides that local content), but it also brings the subject matter closer to the learner, which means it brings it closer to the Zone of Proximal Development (Vygotsky), or enables the learner to create personal learning Flow (Csikszentmihalyi), or simply allows the learner to think ‘this is something of interest to me, and I can learn this easily’.

In a following step: Plan ahead for globalisation
  • Legal issues: looking at IPR or the actual learning that can be produced. 
  • Technology and infrastructure: infrastructure differs. 
  • Assessment and feedback mechanisms: (yes!) Feedback, very important for all involved
  • Selecting a globalizing partner

The report is brief, so not too much detail is given on what is meant by the different sections, but what I did miss here was the addition of peers for providing feedback, or peer actions to create assessments that are actually contextualized and open to cultural approaches. There is no mention of instructional design experts in this section either.
In the third section a quick overview is given of what to take into account while creating global elearning content. Again the focus is on elements and tools (using non-offensive graphics, avoiding culturally heavy analogies, neutral graphics…), not on the actual instruction. Covering that would admittedly take up more than 22 pages, but the instructional approach is to me the source of learning possibilities.

Promoting diverse pedagogy
The final part of the report looks at the team you need, but… still no mention of the instructional design expert (okay, it is a fairly new title, but still!). And no mention of the diversity in pedagogy that could support cultural learning (not every culture is in favor of Socratic approaches, and not every cultural group likes classic lecturing).

Attract instructional designers
While the report makes some brief points of interest, I do feel that it lacks what most reports on training lack: they seem to forget that online instruction is a real profession with real skills, one that takes years to become good at, just like any STEM- or business-oriented job. The report does indicate that corporations acknowledge an interest in online training (and possible profit), but… they still think it can be built easily and does not require specific expertise.
There is no way around it: if you want quality, you need to attract and use experts. If you want to build high-quality online training that will be followed and absorbed by the learner, then interactions, knowledge enhancement, neurobiological effects… all of this will matter and needs to be taken into account (or at least one needs to be aware of it).
Now more than ever, you cannot simply ‘produce a video’ and hope people will come. There are too many videos out there, and a video is a media document, not necessarily a learning element. Learning is about thinking about the outcome you want to have and then working backwards, breaking the learning process down into meaningful steps. Why do you use a video? Why do you use an MCQ? Does this really result in learning, or simply in checking boxes and consuming visual media?

Building common ground as a first global elearning step
Somehow I feel that the first step should include overall acceptance of a cooperatively built basis:
What are our quality indicators (media quality, content quality, reusability, entrepreneurial effect of the learning elements, addressing global diversity in depicting actors (visual and audio), …)?

Which online learning basics does everyone in the company (and involved in training) need to know? Sharing just-in-time learning (e.g. when encountering a new challenge: take notes of the challenge and the solution), sharing best practices on the job (ideal for mobile options), flipped lectures for training moments (e.g. a case study before training hours, role play during workshops…), best practices for audio recordings… These learning basics can be so many things, depending on the training that needs to be created, but they need to be set up collaboratively. If stakeholders feel they will benefit from training and are involved in setting up some ground rules and best practices, they will be engaged. It all comes down to: which type of learning is needed, what does this mean in terms of the pedagogical options available and known, and what do the learners need and use?
Which online learning basics does everyone in the company (and involved in training) need to know: sharing just-in-time learning (e.g. encountering a new challenge: take notes of challenge and solution), sharing best practices on the job (ideal for mobile options), flipped lectures for training moments (e.g. case study before training hours, role play during workshops…), best practices for audio recordings … these learning basics can be so many things, depending on the training that needs to be created, but it needs to be set up collaboratively. If stakeholders feel they will benefit from training, and they are involved in setting up some ground rules and best practices, they are involved. It all comes down to: which type of learning is needed, what does this mean in terms of pedagogical options available and known, and what do the learners need and use.