USING LARGE-SCALE LMS
DATA PORTAL DATA
TO IMPROVE TEACHING AND LEARNING
(AT K-STATE)
INTERNATIONAL SYMPOSIUM ON INNOVATIVE TEACHING AND LEARNING AND ITS
APPLICATION TO DIFFERENT DISCIPLINES
DIGITAL POSTER SESSION
SEPT. 26 – 27, 2017
KANSAS STATE UNIVERSITY – TEACHING & LEARNING CENTER
SESSION DESCRIPTION
• With any learning management system, a byproduct of its function is data, which may be
analyzed to improve awareness, decision-making, and actions. At Kansas State University,
the Canvas LMS instance recently made available cumulative data reaching back to its first
use in 2013. These flat files open a window into how the university is harnessing its LMS,
with some macro-level insights that may suggest areas to improve teaching and learning.
This session describes some basic approaches to making sense of this empirical “big data”:
reviewing the data dictionary, extracting basic descriptions of the respective data sets,
conducting time-based comparisons, surfacing testable hypotheses from data inferences, and
conducting other data explorations. This introduces initial data analysis work only, but it
does not preclude front-end analysis of courses at the micro level, relational database
queries of the data, and other potential follow-on work.
2
PRESENTATION ORDER
• K-State Online Canvas LMS data
portal data
1. About Courses
2. About Course Sections
3. a. About Assignments
b. About Submitted Assignments
4. About Quizzes
5. About Discussion Boards
6. About Learner Submitted Files
7. About Uploaded Files
8. About Wikis and Wiki Pages
9. About Enrollment Role Types
3
PRESENTATION ORDER (CONT.)
10. About Groups
11. About Users and Workflow
States
12. About Course Level Grades
(based on Enrollments)
13. About Conversations (In-System
Emails)
14. About Third-Party External Tool
Activations
15. About Course User Interface (UI)
Navigation Item States
• Summary
4
K-STATE ONLINE CANVAS LMS DATA PORTAL DATA
5
6
7
K-STATE ONLINE CANVAS LMS DATA PORTAL DATA
• LMS data portal data comes from event logs and trace files captured as part of the
provision of learning management system (LMS) services
• These originate from SQL databases and include structured quantitative data and text data
• K-State’s data comes in 79 data tables, which download as .gz files in a zipped
folder
• The .gz files can be opened using 7-Zip
• The extracted files have no extensions
• Appending .csv (comma-separated values format) makes the files readable
in Access, SQL tools, Excel, etc.
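The decompress-and-rename step above can be scripted; a minimal Python sketch (the folder name is a placeholder for wherever the export was downloaded):

```python
import gzip
import shutil
from pathlib import Path

def gz_tables_to_csv(export_dir):
    """Decompress every .gz table in a folder and write each one back
    out with a .csv extension so it opens in Excel, Access, etc."""
    for gz_path in Path(export_dir).glob("*.gz"):
        csv_path = gz_path.with_suffix(".csv")
        with gzip.open(gz_path, "rb") as src, open(csv_path, "wb") as dst:
            shutil.copyfileobj(src, dst)

# e.g. gz_tables_to_csv("canvas_export")  # hypothetical folder name
```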
8
K-STATE ONLINE CANVAS LMS DATA PORTAL DATA
(CONT.)
• Data is updated every day with K-State’s Canvas license.
• The data is already in digital format.
• The data structures are in classic data table format with student data in rows and
variable data in columns.
• The default settings chosen for an LMS instance will likely have a
weighty influence on how the LMS functions are used.
9
DATA HANDLING
• There are millions of rows of data in some of the tables.
• The data has to be handled carefully so that nothing is lost in processing.
• As always, a pristine and “untouchable” raw set of data (never revised or changed in any
way) should be maintained as a hedge against data mishandling. A working copy of this
data can then be accessed for data cleaning and queries…
• The working set of data requires a little cleaning to be useful.
• Outlier data may have to be eliminated so as not to skew curves.
• Data garble should be omitted. Sometimes garble is collected; sometimes people use the
LMS in ways that create garble.
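The raw-set-plus-working-copy practice and the outlier trimming above can be sketched in Python (the directory names are hypothetical, and the z-score cutoff is just one common heuristic, not the only defensible choice):

```python
import shutil
import statistics
from pathlib import Path

# Keep the raw export pristine; do all cleaning on a working copy.
# (Directory names here are hypothetical.)
raw_dir = Path("canvas_export_raw")
work_dir = Path("canvas_export_working")
if raw_dir.exists() and not work_dir.exists():
    shutil.copytree(raw_dir, work_dir)

def drop_outliers(values, z_cutoff=3.0):
    """Drop points more than z_cutoff standard deviations from the
    mean, so a few extreme records do not skew the curves."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    if sd == 0:
        return list(values)
    return [v for v in values if abs(v - mean) / sd <= z_cutoff]
```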
10
A DATA DICTIONARY OR “SCHEMA DOCS”
• There is a data dictionary, or “Schema Docs,” that describes the unlabeled
column data. Not all variables mentioned in the data dictionary are used,
because of different choices for different instances (what universities choose to
collect or not collect, use or not use)…because of changes in data
management over time…and other factors. (Some data columns are
discontinued / “deprecated” / no longer supported.)
• Many of the categories of data are instantiated in multiple data tables, so these have to
be combined (union-joined) for queries involving the full set of N.
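A union-join of part-files can be sketched with Python’s standard library (the glob pattern below is a hypothetical file-naming example, not a confirmed Canvas export name):

```python
import csv
import glob

def load_category(pattern):
    """Union the rows of all part-files matching a glob pattern.

    A category split across several numbered files has to be stacked
    row-wise to recover the full set of N for queries."""
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows

# e.g. assignments = load_category("assignment_dim-*.csv")  # hypothetical names
```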
11
THIS DATA ANALYSIS APPROACH
• The available information and how it is queried can elicit insights for
awareness and decision-making.
• The approach here uses computers to capture meaning. This does not
preclude some “human close reading,” but the amounts of data require
computers to some degree.
12
ADDITIONAL ANALYTICAL APPROACHES
• Data analysis never happens in a vacuum.
• The “owner” (“brand ambassador”) of the instance is a good source of information.
• The back-end information from the LMS data portal can be combined with
front-end accesses for deeper insights about how the LMS can be used.
• The digital data may be compared with other data from other sources
assuming unique identifier columns may be identified and used. (These include
primary and foreign key columns.)
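Matching LMS tables against outside data on shared identifier columns can be sketched as a simple keyed join (the column names here are illustrative, not taken from the schema):

```python
def inner_join(left_rows, right_rows, left_key, right_key):
    """Join two lists of dict-rows on a shared identifier column,
    e.g. a primary key in one table and a foreign key in the other."""
    index = {}
    for r in right_rows:
        index.setdefault(r[right_key], []).append(r)
    joined = []
    for l in left_rows:
        for r in index.get(l[left_key], []):
            joined.append({**l, **r})  # merge the matched rows
    return joined
```

Building the lookup index first keeps the join linear in the table sizes rather than quadratic.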
13
ADDITIONAL ANALYTICAL APPROACHES (CONT.)
• If baseline data is available from other comparable and non-comparable
instances, those could be informative.
• Are there “proper” or normative levels of activities that should be observed for LMSes in
particular stages of a life cycle?
14
ANALYTICS APPROACHES
Given the use of flat files and without direct
application of primary and foreign keys…
Given database queries…
Given computation-based linguistic analysis tools…
Given qualitative data analysis tools…
What is knowable from the data?
What would benefit teaching and learning (T&L)?
15
ANALYTICS APPROACHES (CONT.)
DESCRIPTIVE
• How is the LMS instance being used
by the faculty, staff, and
administration at the university?
• How has the LMS instance been used
over time?
PRESCRIPTIVE
• What are ways to improve the
university’s uses of the LMS and the
related integrated tools?
• How can the LMS be used in a
beneficial way into the future?
16
PRESENTATION:
• 15 areas of insights in the Canvas LMS (with Oct. 2016 data) expressed as
data visualizations
• Ways to interpret the available data (in terms of macro-level use of the LMS)
• Ways to harness that data for improving online teaching and learning with
LMS data and beyond
17
1. ABOUT COURSES
18
19
COURSE VISIBILITY AND T&L
• In the lead-up period to an academic term, are courses publicly viewable to
learners who may want to acclimate and get a start on the work?
• Follow-on questions:
• If so, what is available, and what can learners see and do?
20
21
COURSE WORKFLOW STATES AND T&L
• How many courses are “hard concluded” (completed) at a particular time of
the term (vs. how many should be)?
• “Claimed” courses are undeleted ones that have not yet been published. How
many are in this state (which requires helpdesk or higher-level support to resolve)?
• Follow-on questions:
• Are there ways to head off accidental deletions of courses?
22
2. ABOUT COURSE SECTIONS
23
24
LIFE CYCLE STATE FOR COURSE SECTION AND T&L
• How many courses on the system are active vs. deleted?
• Follow-on questions:
• What is a healthy balance of active to deleted courses? Why?
• Online courses can be wholly recreated from the ground up without deletion…but deletion itself is also
low-cost. It might signal that a course may not be used again. If so, why?
• Sometimes people delete courses because they don’t want the courses to show up in their course
listings. How should that be handled instead (e.g., using the stars to select which courses initially show
up on the user dashboard)?
• Sometimes people delete courses willy-nilly, which then requires reinstating a deleted course.
25
26
DATE RESTRICTION ACCESSES FOR COURSE
SECTIONS AND T&L
• What is the proportion of course sections in the LMS instance that
restrict enrollments to section availability dates?
• Follow-on questions:
• When is there the “use case” of restricted section access to defined dates? Is that the
best way to handle that “use case” / those “use cases”? Are there negative unintended
consequences of using such features or not using such features?
27
28
ABILITY TO SELF-ENROLL IN A SECTION OR NOT
AND T&L
• There are two general types of sections when it comes to online courses. One
is a general section linked to a formal course. These are created based on
learner enrollments in an online enrollment system.
• Others are sections created by instructors to enable segmentation of courses
for different assignments or tracks.
• Manual self-enrollment in a section applies to instructor-created sections with opt-in
learning tracks. Assignment to sections also applies to instructor-created sections.
29
ABILITY TO SELF-ENROLL IN A SECTION OR NOT
AND T&L (CONT.)
• Follow-on questions:
• How important is it for learners to be able to select their own sections in respective
learning sequences?
• What methods do instructors use to assign learners to sections (assignment can be
either random or manual)?
30
3A. ABOUT ASSIGNMENTS
31
32
TYPES OF ASSIGNMENTS AND T&L
• Is there a baseline count for assignment types? Are there optimal mixes for a
university of K-State’s size?
• What does the frequency of the various assignment types mean for how the
assignments are being used at the university?
• What is the proportion of graded to ungraded assignments, and which are
preferable when?
• Are ungraded assignments used for formative assessments to support learning, and are
these opt-in or required? (and if so, in what contexts?)
33
TYPES OF ASSIGNMENTS AND T&L (CONT.)
• Follow-on questions:
• How can a university’s administration and staff encourage more media creation
assignment types (if relevant to the learning)?
34
35
TIME FEATURES FOR ASSIGNMENTS AND T&L
• What types of assignments fit into each category of time features?
• Why are so many assignments without time deadlines or allotments?
• Is this a negative in an LMS that has an auto-created calendar and auto-created syllabus
based in part on deadlines?
• How do instructors use deadlines on assignments? And the converse: How do instructors
not use deadlines on assignments?
• Is it beneficial to learners to have no deadlines or some deadlines or all-
defined deadlines? And for which assignments?
36
TIME FEATURES FOR ASSIGNMENTS AND T&L (CONT.)
• Follow-on questions:
• How soon before an assignment is due is it unlocked? Would learners do better if they
had more time to prepare for an assignment before its unlocking?
37
38
MAIN THEMES AUTO-IDENTIFIED IN ASSIGNMENT
NAMES AND T&L
• In terms of assignment names, what are the most frequent words used?
• What do these (particularly subtopics) suggest about the work that learners do?
• Follow-on questions:
• What are some “long tail” terms from this set of assignment names? The terms in the
“long tail” are those used only infrequently.
• What about controversial terms used in the assignment names?
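One way to surface “long tail” terms is a frequency count that keeps only rarely used words; a minimal Python sketch (the tokenization is deliberately crude, and the sample assignment names are invented):

```python
from collections import Counter

def long_tail(texts, max_count=1):
    """Return terms used at most max_count times across a text set:
    the 'long tail' of a frequency distribution.

    Tokenization here is crude (lowercase, split on whitespace); a
    real pass would also strip punctuation and stop words."""
    counts = Counter(word for t in texts for word in t.lower().split())
    return sorted(w for w, c in counts.items() if c <= max_count)

names = ["Lab 1 Report", "Lab 2 Report", "Ethics Reflection", "Final Exam"]
print(long_tail(names))  # the frequent terms "lab" and "report" drop out
```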
39
40
SOME LINGUISTIC FEATURES OF THE ASSIGNMENT
TITLES AND DESCRIPTIONS AND T&L
• Assignment titles and descriptors rank very high on analytic features (92nd
percentile). They rank in the 73rd percentile in clout and the 65th percentile on
tone, but in the 12th percentile for authentic tone (warmth).
• Follow-on questions:
• Are there ways to improve the sense of human warmth in assignment descriptions?
Would that be beneficial or harmful for the learning?
41
42
DELVING INTO TOPICS OF INTEREST AND T&L
• It is possible to select terms and phrases (unigrams / one-grams, bigrams /
two-grams, three-grams, four-grams, etc.) to explore in the text set, to see the
words leading up to the target terms and the terms leading away. In the
prior word tree, “lab” was the target term.
• Follow-on questions:
• In assignment titles and assignment descriptions, any number of terms may be of interest.
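The word-tree idea (words leading up to and away from a target term) can be approximated with a keyword-in-context pass; a small Python sketch using invented sample text:

```python
def contexts(texts, target, window=3):
    """Keyword-in-context: for each occurrence of a target term,
    collect the words leading up to it and leading away from it.
    This is the raw material behind a word-tree visualization."""
    hits = []
    for text in texts:
        words = text.lower().split()
        for i, w in enumerate(words):
            if w == target:
                before = " ".join(words[max(0, i - window):i])
                after = " ".join(words[i + 1:i + 1 + window])
                hits.append((before, after))
    return hits

print(contexts(["Submit your lab notebook by Friday",
                "Lab safety quiz"], "lab"))
```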
43
44
GRADES VIEWABLE BY STUDENTS? MUTED VS.
UNMUTED ASSIGNMENTS AND T&L
• A minority of student assignments’ grades are muted (whether temporarily or
permanently) in a course. Also, in some courses, some or all of the grades
may be muted.
• What are some ways that instructors use grade muting?
• What does grade muting enable bureaucratically or otherwise (such as for grade
adjustments)?
• How may grade muting enable learners to learn with less self-imposed or other-imposed
pressure?
45
GRADES VIEWABLE BY STUDENTS? MUTED VS.
UNMUTED ASSIGNMENTS AND T&L (CONT.)
• Follow-on questions:
• How aware are instructors of the assignment / quiz grade muting and unmuting functions?
What about learners?
• Do learners miss out on the benefits of assigned grade and other feedback if assignment
muting is applied?
• Do learners miss out on the benefits of not seeing assigned grades, such as less pressure
and less anxiety, if grade muting is not applied?
46
47
ASSIGNMENT WORKFLOW STATES AND T&L
• Why are assignments put into published, unpublished, and deleted states?
• Follow-on questions:
• Are there ways to improve the teaching and learning experience for learners by making
assignments more readily available in a sequence such as in modules?
• Are there ways to improve the learning experience by rolling out learning in time, so
learners are not overwhelmed by an entire revealed course at the beginning?
48
49
SURVIVAL FUNCTION OF ASSIGNMENTS TO
UPDATE AND T&L
• How long do assignments “survive” before they are updated?
• What does it mean that some assignments may not have been updated for over a
thousand days (a little less than three years)?
• What does it mean that assignments that are updated tend to be updated
shortly after they were created? Does this mean better quality of update or
not?
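The “survival” measure above can be approximated from creation and update timestamps; a minimal Python sketch (the column names and date format are assumptions about the export, and identical timestamps would mean a record was never revised):

```python
from datetime import datetime

def days_to_update(created_at, updated_at, fmt="%Y-%m-%d"):
    """How long a record 'survived' before it was updated, in days."""
    delta = datetime.strptime(updated_at, fmt) - datetime.strptime(created_at, fmt)
    return delta.days

def survival_curve(durations):
    """For each observed duration t, the fraction of records still
    un-updated after t days: a simple empirical survival function."""
    n = len(durations)
    return {t: sum(d > t for d in durations) / n for t in sorted(set(durations))}
```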
50
SURVIVAL FUNCTION OF ASSIGNMENTS TO
UPDATE AND T&L (CONT.)
• Follow-on questions:
• How quickly are assignments used after they are created? How often are assignments
updated as soon as there is feedback from learners?
• How many assignments are never updated after the first point-of-creation? What are
the proportions?
• Why do instructors and GTAs update assignments?
• What are the most common updates to assignments? Are these improvements beneficial
to the learning or not?
51
3B. ABOUT SUBMITTED ASSIGNMENTS
52
53
GRADES SUBMITTAL COUNTS FOR COMPLETED
ASSIGNMENTS AND T&L
• A majority of the submitted assignments have received grades, but a not-
insignificant number have not.
• Follow-on questions:
• How quickly do learners expect grades to arrive? What actions do they take if grades
haven’t arrived within a certain amount of time?
• How often do they check their grades?
• How much weight do learners give to the grades they receive?
54
4. ABOUT QUIZZES
55
56
A SURVEY OF QUIZ TYPES AND T&L
• In terms of quiz types, a majority are “assignment” types, then practice
quizzes, then graded surveys, and then surveys. What are the respective
functionalities of each? How many at a campus are aware of the various
types of quizzes and their respective functionalities?
• Follow-on questions:
• How do the uses of the LMS instantiate the various types of quizzes? What are some
constructive models for the respective uses of each?
57
58
QUIZ QUESTION TYPES IN THE LMS INSTANCE AND
T&L
• A majority of question types used in the LMS are based on automated
assessment. Some—essay questions, file uploads, short answer questions, text
only questions—often require human interventions.
• Follow-on questions:
• What sorts of assignments use human-intervention-type questions?
• Is there higher quality feedback with non-automated types of assessments by experts (vs.
GTAs)?
59
60
QUIZ QUESTION WORKFLOW STATES AND T&L
• A majority of quiz questions are unpublished. Some are published. A small
number are deleted.
• Follow-on questions:
• What quiz questions are deleted, and why? Are these replaced?
• How many of the quiz questions are from third-party content providers?
61
62
AN INCLUSIVE SCATTERPLOT OF QUIZ POINT
VALUES AND T&L
• The min-max range on quiz point values is 0 – 23,000.
• A majority of quiz values are very low comparatively.
• The average point value of a quiz in the LMS is 33 points (without zeroes
integrated) and 28 points (with zeroes integrated).
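The with-zeroes vs. without-zeroes averages can be computed like this (the point values below are illustrative, not the K-State figures):

```python
def mean_points(points, include_zeroes=True):
    """Average quiz point value, with or without zero-point quizzes.

    Zero-point entries (e.g., ungraded surveys or practice quizzes)
    pull the average down, so both figures are worth reporting."""
    values = points if include_zeroes else [p for p in points if p != 0]
    return sum(values) / len(values) if values else 0.0

quiz_points = [0, 0, 10, 50, 100]  # invented sample values
print(mean_points(quiz_points))                        # with zeroes
print(mean_points(quiz_points, include_zeroes=False))  # without zeroes
```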
63
AN INCLUSIVE SCATTERPLOT OF QUIZ POINT
VALUES AND T&L (CONT.)
• Follow-on questions:
• What do you have to do to pass a 23,000 point quiz?!
• If defined, how much time is allowed / expected for a quiz?
• What does a low-value assessment look like? A high-value assessment?
• What are typical quiz designs? What are atypical quiz designs?
64
HISTOGRAM OF QUIZ
POINT VALUES IN LMS
INSTANCE (WITH A
NORMAL CURVE) AND
T&L
65
66
SURVIVAL CURVE OF DELETED QUIZZES AND T&L
• Quizzes that are ultimately deleted tend not to last very long.
• Follow-on questions:
• Why are quizzes deleted (instead of revised)?
67
68
ONE MINUS SURVIVAL FUNCTION CURVE FOR
DELETED QUIZZES AND T&L
• Quizzes that last a certain number of days tend to survive without being
deleted. Why?
• Follow-on questions:
• What is it that instructors look for in a quiz to ensure that they will continue to use it?
• Once instructors have committed to a quiz, how long will they tend to use that quiz for
without revision?
69
HAZARD FUNCTION
FOR DELETED
QUIZZES AND T&L
70
5. ABOUT DISCUSSION BOARDS
71
72
TYPES OF DISCUSSION BOARDS: ANNOUNCEMENT
VS. DEFAULT AND T&L
• Discussion topic types may be either “announcement” or “default” (blank). An
“announcement” type has text in the body; a “default” one just has a title but
no body text or prompt.
• Follow-on questions:
• How are announcement discussion boards set up for teaching and learning? How are
default discussion boards set up for teaching and learning?
• Do learners do better with more prompts for contents or not?
73
74
WORKFLOW STATES OF DISCUSSION BOARDS
AND T&L
• Discussion board topics may be in various states: unpublished, active, locked,
deleted, and post_delayed. When are these various states practically
applied in a live learning context? To what end?
• Follow-on questions:
• How are these various states of discussion boards used and instantiated in the LMS
instance? Which are beneficial to learning, and which not? Which are beneficial to
teaching, and which not?
75
76
ACTIVE VS. DELETED DISCUSSION BOARD ENTRIES
(REPLIES) AND T&L
• A majority of discussion board entries are left active and available, but a
minority are deleted.
• Follow-on questions:
• Which ones end up being deleted, and why?
• What information is lost with the deletion of discussion board entries (replies)?
77
6. ABOUT LEARNER SUBMITTED FILES
78
79
HANDLING OF LEARNER SUBMISSIONS AND T&L
• In one slice-in-time, a majority of learner-submitted assignments (file uploads)
were graded, a lesser number were not graded, and a small minority were
auto-graded (perhaps code uploads corrected with a scripted auto-grader?).
• What types of assignments do instructors choose to auto-grade?
• What types of assignments do instructors choose to self-grade / manually
grade?
80
HANDLING OF LEARNER SUBMISSIONS AND T&L
(CONT.)
• Follow-on questions:
• What sorts of files are requested in file upload assignments? Diagrams? Maps? Photos?
Designs? Audio files? Video files? Papers? Others?
• What are some ways that learners benefit from having fewer auto-graded files? What are
some costs to using a fair amount of instructor-graded works?
• How quick of a turnaround do learners expect for learner-submitted assignments?
• How much feedback do they expect?
• How many of these are peer-assessed vs. GTA vs. self vs. instructor assessed?
• How many of these assignments are made public to the course learners, as in an online
gallery?
81
82
SOME COMMON WORDS FROM COMMENTS
MADE ON SUBMISSIONS AND T&L
• When learners submit file uploads, they often type text into the field accompanying the file
upload. The prior word cloud shows frequency counts of words from the comments made with
the digital file submissions.
• Follow-on questions:
• What are substantive content terms used in the text learners use when uploading digital files?
• How positive or negative are the sentiments expressed as learners are uploading the files?
• What are the main purposes of the textual contents when learners share a message with their instructor when
uploading a file for an assignment?
• What are common questions when learners share a message with their instructor when uploading a file for an
assignment?
83
84
SUBMISSION COMMENT PARTICIPATION TYPE AND
T&L
• Submission comment types may be categorized into three types: admin,
author, and submitter.
• The admin may be the instructor.
• The author may be whoever wrote the message.
• The submitter may be whoever submitted the file. (The data dictionary does not seem
clear on this.)
85
SUBMISSION COMMENT PARTICIPATION TYPE AND
T&L (CONT.)
• Follow-on questions:
• What sorts of learning interactions go on around uploaded files?
• What are potential learning gains for the instructors? The co-learners? The target learner
who shared the file?
86
7. ABOUT UPLOADED FILES
87
88
UPLOADS AND REVISIONS OF FILES TO THE LMS
INSTANCE BY YEAR AND T&L
• The Files area works as a “loading dock” to the particular online course. It enables
the upload of a limited amount of digital files which may be pointed to from the
Pages, Syllabus, Modules, and other sections of the online course.
• Some instructors use the Files area directly for learners to download course information and
datasets.
• In the first four years of the LMS’s use at K-State (we’re in Y5 now), there has been
growing usage of this feature.
• This feature is bolstered by the use of Mediasite as a third-party video hosting site
and player (and desktop lecture capture tool).
89
UPLOADS AND REVISIONS OF FILES TO THE LMS
INSTANCE BY YEAR AND T&L (CONT.)
• Follow-on questions:
• What sorts of files are being uploaded? Content-wise, is there inherent teaching and
learning value?
• Depending on how the digital contents are harnessed for teaching and learning, what is
the learning value?
90
91
OBSERVED UPLOADED FILE TYPES AND T&L
• In descending order, the top 10 most popular file types uploaded into this
instance of Canvas were: .docx, .pdf, .jpg, .png, .pptx, .xlsx, .ppt, .zip, .dat,
and .xl. The top 10 file types include text files, image files, slideshow files,
folders of digital contents, data files, and spreadsheet files. Further on, there
are videos, web pages, audio files, and others.
• The uploaded file types may indicate what kinds of technologies learners
have access to in their learning.
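Tallying file types by extension is a one-pass count; a small Python sketch with invented file names:

```python
from collections import Counter
from pathlib import PurePosixPath

def extension_counts(filenames):
    """Tally uploaded-file types by extension (lowercased; files
    without an extension are counted under '(none)')."""
    exts = [PurePosixPath(n).suffix.lower() or "(none)" for n in filenames]
    return Counter(exts)

print(extension_counts(["Paper.DOCX", "notes.pdf", "photo.jpg", "README"]))
```

`Counter.most_common()` then gives the descending top-10 ranking directly.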
92
OBSERVED UPLOADED FILE TYPES AND T&L (CONT.)
• Follow-on questions:
• How are the various file types used in respective assignments?
• What is the quality of the online learning contents that learners create?
• Are there ways to increase the multi-modality of the digital objects that learners create?
93
94
WORD CLOUD OF FILE CONTENTS (FROM THE
DESCRIPTIONS OF FILE CONTENTS) AND T&L
• File contents are the files uploaded through the “loading docks” as well as files
uploaded to help create individual profiles within the Canvas LMS (for the particular
instance).
• The word cloud on the prior slide shows the most frequent words mentioned in the
named uploaded files. The assumption is that the file names are informational (and
some are, others not).
• In this case, the long tail may be more informative…since the most frequent words
will be taken up with generic terms labeling what the files may be for.
95
WORD CLOUD OF FILE CONTENTS (FROM THE
DESCRIPTIONS OF FILE CONTENTS) AND T&L (CONT.)
• Follow-on questions:
• What file naming protocols can instructors use to be as specific and informative about file
contents as possible?
• If the Files area is used by instructors in a published way, what folder structure, folder
naming protocols, and folder names can be used to be as informative as possible?
96
HIGH FREQUENCY
WORD COUNTS IN THE
FILE NAMES SET (AS
ONEGRAMS /
UNIGRAMS) AND T&L
Some of the terms, like “reflection” and
“review,” are indicative of pedagogical
awareness.
The references to “guide” and others are
also indicative of cognitive scaffolding.
97
8. ABOUT WIKIS AND WIKI PAGES
98
99
PARENT TYPES FOR WIKI PAGES AND T&L
• Parent types for wiki pages may be either “course” or “group.” A “course”
wiki page points to the pages created at the course level and that may be
shared in a modular or other context. A “group” wiki page points to pages
that learners in groups may have created or co-created. For example,
Groups can have home pages.
100
PARENT TYPES FOR WIKI PAGES AND T&L (CONT.)
• Follow-on questions:
• What are the purposes of wiki pages created at the “course” level? What sorts of digital
contents are included in each?
• What are the purposes of wiki pages created at the “group” level? What sorts of
digital contents are included in each?
• What learning activities are related to the respective wiki pages? How effective are
these learning activities in context?
101
102
WIKI PAGE WORKFLOW AND T&L
• The wiki page workflow state can be in four conditions: active, deleted,
unpublished, and null. “Null” may be a default state when a page exists but
has no contents, and such pages may be pre-made once groups are created,
for example. (The documentation is not clear.)
• An active page is one that has been created and can be viewed.
• A deleted page is one that no longer is available.
• An unpublished page is one that is being drafted.
103
WIKI PAGE WORKFLOW AND T&L (CONT.)
• Follow-on questions:
• What are some creative wiki pages that may be an inspiration to other teachers and
learners?
• What are some functions of wiki pages that people do not use often, like
• iframes (inline frames)
• embedded video
• third-party software integrations (like Twitter streams), or others?
• How can more creative work on the pages be encouraged?
104
105
WORD FREQUENCY WORD CLOUD FROM WIKI
PAGE TITLES AND T&L
• Wiki page titles refer to the text-based names of each page. Those are required
fields. The word cloud on the prior slide refers to the most frequent terms found in
this text set of titles (treated as a “bag of words”).
• A glance at the word cloud shows action words related to learning: studio, lab, teaching,
project, design, clinical, experience, and public.
• There are subject words, too: physics, chemistry, grain, infant, and others.
• If nothing else, these are suggestive of some of what the pages do.
• In any word frequency count, the “long tail” of few or even single exemplars also will
have value here.
106
WORD FREQUENCY WORD CLOUD FROM WIKI
PAGE TITLES AND T&L (CONT.)
• Follow-on questions:
• How do these word clouds change over time? Are there differences between these word
frequency counts from wiki page titles year over year? Between departments? Between
domains?
• What about non-English terms in wiki page titles (given the affordances of UTF-8)?
107
9. ABOUT ENROLLMENT ROLE TYPES
108
ABOUT ENROLLMENT ROLE TYPES
Role Name            Basic Role Type
Librarian            TAEnrollment
StudentEnrollment    StudentEnrollment
TeacherEnrollment    TeacherEnrollment
TAEnrollment         TAEnrollment
DesignerEnrollment   DesignerEnrollment
ObserverEnrollment   ObserverEnrollment
Grader               TAEnrollment
GradeObserver        TAEnrollment
109
UNIVERSITY-DEFINED ROLES AND CAPABILITIES AND
T&L
• “Enrollment role types” are defined by the university. The “role name” is the
publicly facing side of the role, and the “basic role type” deals with the role-
based functionalities (built on the idea of “least privilege” or “give people
only as much access as they need so as not to compromise security”).
• For example, librarians have as much access as teaching assistants do, based on the
table.
• Do site users have all the access that they need for what they need to do?
• Is the system resilient against potential deletion of data? (sorta)
110
UNIVERSITY-DEFINED ROLES AND CAPABILITIES AND
T&L (CONT.)
• Follow-on questions:
• Are there super roles that need to be created beyond “admin”?
• Are there more circumscribed roles that need to be created beyond “observer”?
• Are there dedicated roles for one-off applications?
111
112
FREQUENCIES OF ENROLLMENT ROLES AND T&L
• The treemap on the prior slide shows “frequencies of enrollment roles,” with
the most popular roles in the following descending order: students, teachers,
studentview (observer), teaching assistant, and designer enrollments.
• Are the relative frequency counts / proportions correct?
• Follow-on questions:
• Are there new roles that need creating or current roles that need revision or tweaking?
113
114
TOP DOZEN COMPUTER SYSTEM CONFIGURATIONS FOR
ACCESSING LMS INSTANCE AND T&L
• This section captures a large amount of nuance about the computer systems used
to connect to Canvas for teaching and learning. These data inform what
technologies should be used to build digital learning objects, the types of
outputs that should be produced, and the accommodations needed to ensure
playability and accessibility.
• Based on the technologies that people use, digital learning objects need to be
tested on Firefox browser and on mobile devices. However, there’s more…
115
TOP DOZEN COMPUTER SYSTEM CONFIGURATIONS FOR
ACCESSING LMS INSTANCE AND T&L (CONT.)
• Follow-on questions:
• What are all the main ways that people use to connect to the K-State instance of
Canvas? How can universal design be applied to ensure that they can all access the
contents in the most accessible way possible?
• Also, how can the Canvas apps for iOS and Android be designed for optimal
small-screen access?
116
117
REQUEST TYPES AND T&L
• The most common types of requests on the Canvas site for the K-State
instance are “GET” and “POST.” In other words, people retrieve or “read”
contents, and they upload or “create” contents.
• Follow-on questions:
• There are two main request types. Are the other types necessary, and if so, how can their
use be encouraged?
118
10. ABOUT GROUPS
119
120
GROUP NAMES FREQUENCY WORD CLOUD AND
T&L
• Instructors sometimes use inspiring group names to rally their learners. The
word cloud on the prior slide shows the most common words found in the text
set of group names.
• Again, it is important to check the “long tail” of the resulting Pareto chart
from the data table with the word frequency counts.
• Follow-on questions:
• Learner motivation is important. What are ways to encourage instructors to have creative
and inclusive / respectful names for learner teams?
121
122
MODERATOR STATUS
OF LEARNERS IN
GROUPS AND T&L
The “moderator status” refers to assigned
leadership in learner groups. In the K-State
instance, very few instructors have decided to
go with student leaders in the groups,
preferring leadership to emerge (rather than
be assigned), apparently.
123
MODERATOR STATUS OF LEARNERS IN GROUPS
AND T&L (CONT.)
• Follow-on questions:
• When does it make sense to assign student leaders instead of having them emerge?
• Should “moderators” be trained? Would their work be supervised by the instructor?
• How can leadership be brought into play without learners feeling disenfranchised?
124
125
LEARNER MEMBERSHIP STATUS IN GROUPS AND
T&L
• Based on the numbers in the prior bar chart, learner membership statuses
were either accepted or deleted. There were none in process—none invited
without an answer, and none “requested” without approval or disapproval.
• Instructors may not be using groups that people can apply for membership in;
if this feature is beneficial to learning, it may be important to explore it
and put it into play strategically and tactically.
126
LEARNER MEMBERSHIP STATUS IN GROUPS AND
T&L (CONT.)
• Follow-on questions:
• How well are groups being used in an LMS? Do they create a sense of camaraderie and
collegiality that may be supportive of learning?
• What is the level of collaborative creative work in learner groups?
• What is the level of discourse in learner groups?
• What are the social dynamics in learner groups?
127
11. ABOUT USERS AND WORKFLOW STATES
128
129
USER “WORKFLOW” STATES AND T&L
• Users, the people who are in an LMS instance, may have their accounts in one of four
states: creation_pending, deleted, pre_registered, and registered.
• As may be seen in the prior slide’s pie chart, very few are waiting to have their
accounts approved (and there are internal organizational processes for that).
• Some have been deleted (also based on internal policies and practices).
• Some who have been pre-registered, maybe as visiting high school students or
external collaborators on online courses and trainings, are in the instance in not-
insignificant numbers. (It’s not quite clear in the data dictionary what the respective
roles may be.)
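These state counts can be pulled straight from the users flat file. A minimal sketch, assuming the file has been given a .csv extension (as described earlier in the deck) and carries a `workflow_state` column per the data dictionary:

```python
import csv
from collections import Counter

def workflow_state_counts(path, state_column="workflow_state"):
    """Tally user accounts by workflow state from a users flat file (.csv)."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row[state_column] for row in csv.DictReader(f))
```

The same tally works for any categorical column in the flat files (course workflow states, enrollment states, and so on).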
130
USER “WORKFLOW” STATES AND T&L (CONT.)
• Follow-on questions:
• Are people being processed in or out quickly and efficiently enough for teaching
and learning?
• For learners who need extended access, are there systems and policies and trained
people in place to meet their needs?
131
132
YEARS OF ORIGINATION OF USER ACCOUNTS AND
T&L
• The years of origination show a major push in 2014 to get everyone on the
system.
• After that, the numbers appear to settle into a baseline of learners entering the
online system.
• Not all students have online accounts, but many F2F courses and blended
courses also use the LMS. Also, there are non-course uses of the LMS.
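The year-over-year origination counts can be sketched the same way, assuming a `created_at` column in an ISO-like YYYY-MM-DD timestamp format (column name per the data dictionary; verify against the actual export):

```python
import csv
from collections import Counter

def account_creation_years(path, created_column="created_at"):
    """Count user accounts by the year in their creation timestamp.
    Rows with blank or malformed timestamps are skipped."""
    years = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            stamp = row.get(created_column) or ""
            if len(stamp) >= 4 and stamp[:4].isdigit():
                years[stamp[:4]] += 1
    return dict(sorted(years.items()))
```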
133
YEARS OF ORIGINATION OF USER ACCOUNTS AND
T&L (CONT.)
• Follow-on questions:
• Based on knowledge of the campus and its flow of people, how well are they being
integrated into the affordances of the LMS? Is there enough technical support and other
support to ensure that users’ needs are met?
134
135
RETIRED ACCOUNTS = REGISTERED FALSE AND T&L
• The word cloud on the prior slide gives a light human sense of those who have
been off-ramped from the LMS.
• The use of first names alone makes these accounts essentially impossible to
re-identify, but rare last names in the long tail may make some records somewhat
re-identifiable (so, as always with data, use with care).
136
RETIRED ACCOUNTS = REGISTERED FALSE AND T&L
(CONT.)
• Follow-on questions:
• Whose accounts are being retired, and why?
137
138
CREATED PSEUDONYMS AND T&L
• “Pseudonyms” are the “logins associated with users,” which can enable
integrations with other databases to capture information at the individual
human level.
• In the pseudonyms category, there can be other identifiers tied to a person.
• These may be used to anonymize individuals to enable data extraction and
research without the risk of leaking personally identifiable information (PII).
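One common approach for the research use case is a salted one-way hash: the same login always maps to the same opaque token, so records for one person can still be joined across tables without exposing the login itself. This is a sketch of the general technique, not the Canvas mechanism:

```python
import hashlib

def anonymize_id(raw_id, salt):
    """One-way (SHA-256) hash of an identifier plus a secret salt.
    Stable for a given (id, salt) pair, so it can serve as a join key;
    if the salt leaks, known logins can be brute-forced back."""
    return hashlib.sha256((salt + str(raw_id)).encode("utf-8")).hexdigest()
```

Strictly speaking this is pseudonymization rather than anonymization: the salt must be kept secret, and the output should never be stored alongside the original login.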
139
CREATED PSEUDONYMS AND T&L (CONT.)
• Follow-on questions:
• This area relates more to developer and DBA work.
140
141
CURRENT “STATES” OF PSEUDONYMS AND T&L
• 97% of pseudonyms are active, and 3% are deleted.
• Proper management of pseudonyms means that active users should be included,
and those who are no longer active should have their pseudonyms deleted, so
that the records remain accurate.
142
CURRENT “STATES” OF PSEUDONYMS AND T&L (CONT.)
• Follow-on questions:
• How accurately maintained are the pseudonyms?
• Are they set up as accurately as possible?
143
12. ABOUT COURSE LEVEL GRADES (BASED ON
ENROLLMENTS)
144
145
NUMBERS OF ATTEMPTS FOR LATEST SUBMITTED
ASSIGNMENTS AND T&L
• The number of attempts on assignments tends toward only one or two, even for
assignments that allow more than one submittal.
• A majority are “null,” which may suggest a one-time submittal.
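Flat-file exports may represent missing values as empty strings or `\N` tokens; keeping nulls as their own bucket preserves the "possible one-time submittal" signal. A hedged sketch (the column name `attempt` is assumed from the data dictionary):

```python
import csv
from collections import Counter

def attempt_distribution(path, attempt_column="attempt"):
    """Distribution of submission attempt counts, with nulls kept as
    their own bucket rather than dropped."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            value = (row.get(attempt_column) or "").strip()
            if value in ("", "\\N") or value.lower() == "null":
                counts["null"] += 1
            else:
                counts[value] += 1
    return counts
```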
146
NUMBERS OF ATTEMPTS FOR LATEST SUBMITTED
ASSIGNMENTS AND T&L (CONT.)
• Follow-on questions:
• Is it positive or negative to have multiple assignment submittals? In some cases, positive;
in others, negative.
147
13. ABOUT CONVERSATIONS (IN-SYSTEM EMAILS)
148
149
CONVERSATIONS WITH MEDIA OBJECTS INCLUDED
AND T&L
• In Canvas, “conversations” are internal emails within the system.
• It is possible to attach media objects to these emails.
• Only 49 per 100,000 conversations (roughly 5 in 10,000) in the Canvas LMS
instance contain a media object.
150
CONVERSATIONS WITH MEDIA OBJECTS INCLUDED
AND T&L (CONT.)
• Follow-on questions:
• When do media objects (audio, video, multimedia, and others) add learning value in
internal emails?
151
152
CONVERSATIONS W/ OR WITHOUT ATTACHMENTS
AND T&L
• 11% of emails in the Canvas instance contain attachments; 89% do not.
• These attachments may be any sort of file, from text documents to “media
objects” (like audio files, video files, etc.).
• Some attachments to “conversations” are assignments to learners from
instructors. Some attachments are homework assignments by learners to
instructors. Some are notes between learners. There are many other use
cases for attachments to conversations in the LMS.
153
CONVERSATIONS W/ OR WITHOUT ATTACHMENTS
AND T&L (CONT.)
• Follow-on questions:
• Attachments (articles, slideshows, images, and media objects) can often be highly
valuable for learning. How can instructors be encouraged to add these when relevant
(without overwhelming learners with too much information)?
154
155
ORIGINS OF CONVERSATIONS / MESSAGES AND
T&L
• The “origins of conversations” may be either human or system-generated. It is
possible that auto-generated messaging by email was not turned on, which
would explain why there are none.
• Follow-on questions:
• Are there potential benefits to auto-generated conversations? (There already are such
emails to @k-state.edu / @ksu.edu emails, but should auto-generated conversations exist
for the LMS instance as well? Or would this just eat up memory and frustrate people?)
156
157
CONVERSATION MESSAGES WORD FREQUENCY
COUNT AND T&L
• What do people in an LMS converse about in the internal email? All sorts of
education concerns, of course (as may be seen in the prior word cloud).
• Follow-on questions:
• What communications do learners handle in-LMS vs. through other means (university
email systems? social media? discussion boards? web conferencing sessions?)?
Are the communications set up as efficiently as possible for learning value?
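A word frequency count like the one behind the prior word cloud can be reproduced in a few lines, given the message bodies as plain text. The stopword list here is illustrative only, not the one used in the original analysis:

```python
import re
from collections import Counter

# A tiny, illustrative stopword list; real analyses use much larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "for",
             "on", "i", "you", "it", "this", "that", "my", "be", "are"}

def word_frequencies(messages, top_n=10):
    """Case-folded word frequency count over message bodies, with
    stopwords removed so content words surface."""
    words = Counter()
    for body in messages:
        for token in re.findall(r"[a-z']+", body.lower()):
            if token not in STOPWORDS:
                words[token] += 1
    return words.most_common(top_n)
```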
158
159
MASS CONVERSATION MESSAGE CONTENTS AND
T&L
• In general, at the macro level of the LMS instance over the four years of the analysis,
the conversations tend to be analytical, power-based, and positive in tone, but the
authenticity of the communications tends to be low. (In this measure, higher
authenticity scores indicate “honest, personal, and disclosing” communication;
lower scores indicate “a more guarded, distanced form of discourse.”)
• Follow-on questions:
• Is academic speech more analytical, power-based, and positive in sentiment but more
guarded? Maybe. Those who want to succeed may have to be this way.
160
161
MESSAGING ABOUT “HUMAN DRIVES” IN THE
MASS CONVERSATION MESSAGES AND T&L
• The human drives in the expressed conversations on the LMS instance showed
a tendency toward affiliation, power, and reward. There is less push for
achievement and little sense of risk-taking.
• Follow-on questions:
• Risk-taking by learners is thought to be positive if it enables people to be more confident
and active in their learning. Should the communications be more achievement oriented
and risk oriented?
162
163
SENTIMENT ANALYSIS OF SAMPLE OF
CONVERSATION MESSAGING AND T&L
• From the sample of conversation messaging, the messages were equal parts
“very positive” and “moderately negative.” Very few messages were “very
negative.”
• It seems healthy that people can be both constructively positive and constructively negative.
• The next step would be to analyze the body of the messages that were coded
to the various categories of sentiment.
• Of course, there is a risk of stumbling across sensitive information in this part of the
LMS data portal.
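The actual coding of messages into sentiment categories was done with dedicated software; as a toy illustration only, a lexicon-based scorer shows the basic bucketing. The word lists below are invented for the example and far too small for real use:

```python
# Invented, illustrative word lists -- not from the original analysis.
POSITIVE = {"thanks", "great", "good", "helpful", "appreciate", "excellent"}
NEGATIVE = {"confused", "wrong", "unfair", "late", "problem", "missing"}

def sentiment_label(message):
    """Label one message positive / negative / neutral by lexicon hits."""
    tokens = [t.strip(".,!?;:") for t in message.lower().split()]
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A validated instrument (or a trained classifier) would be needed before acting on such labels.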
164
SENTIMENT ANALYSIS OF SAMPLE OF
CONVERSATION MESSAGING AND T&L (CONT.)
• Follow-on questions:
• There has to be a balance of positive and perhaps slightly negative sentiment in the
learning context to help people feel supported and cared for. Instructors who have
been teaching a long time have a sense of the right balance. Looking at the emails
may be too intrusive…but the general point holds that people are more responsive
to positive supports than to negative messaging.
165
166
AUTO-EXTRACTED THEME BASED HIERARCHY CHART
OF CONVERSATION MESSAGING SAMPLE & T&L
• Auto-extracted themes (from machine learning) capture a range of topics and
related sub-topics in the email conversations inside the Canvas LMS.
• The treemap shows some of these elements.
• Follow-on questions:
• Would online emails change if people knew how observable such conversations are,
should anyone be interested?
167
168
AUTO-EXTRACTED THEMES FROM CONVERSATION
MESSAGING SAMPLE AND T&L
• Per the piechart on the prior slide, it looks like the conversations are often
time-based and study-based.
• Follow-on questions:
• Some instructors create discussion boards where people may post questions and get
answers back in a timely fashion. Having a central location for questions and answers
may make intercommunications more efficient.
169
170
CONTEXTS OF “HELP” USE FROM THE CONVERSATION
TEXT SET EXPRESSED IN A WORD TREE AND T&L
• It is possible to drill down into a text set to see the context of how a common
term is used (or even uncommon terms). The prior word tree used “help” as a
target term to see the context in which that term was used.
• Follow-on questions:
• Are there efficient ways to get help to learners from conversations without the heavy
burden of responding to every student ASAP via email?
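The raw material behind a word tree is a key-word-in-context (KWIC) concordance: every occurrence of a target term together with its immediate left and right context. A minimal sketch:

```python
import re

def keyword_in_context(texts, target, window=3):
    """Return (left-context, target, right-context) tuples for each
    occurrence of the target term across a set of texts."""
    target = target.lower()
    lines = []
    for text in texts:
        tokens = re.findall(r"\w+", text.lower())
        for i, tok in enumerate(tokens):
            if tok == target:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                lines.append((left, target, right))
    return lines
```

Grouping the tuples by their right-hand context reproduces the branching structure a word tree visualizes.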
171
14. ABOUT THIRD-PARTY EXTERNAL TOOL
ACTIVATIONS
172
173
NUMBERS OF EXTERNAL TOOL ACTIVATIONS AND
T&L
• With the advent of Learning Tools Interoperability (LTI), many organizations
and corporations have built tools that interconnect their respective resources
(online software, online services) with LMSes.
• These bridging tools are available to use in the Canvas LMS instance. The
third-party apps are free, but the actual online hosted resources vary in terms
of costs.
• Some resources are open-source, and others are proprietary. Some are wholly free, and
others require subscriptions or payments.
174
NUMBERS OF EXTERNAL TOOL ACTIVATIONS AND
T&L (CONT.)
• Follow-on questions:
• What online learning resources would be beneficial to learners? (Khan Academy?
TED-Ed? YouTube EDU? Google Maps? Google Docs? GitHub?)
175
176
NAMED EXTERNAL TOOL ACTIVATIONS AND T&L
• The summary treemap shows the various named third-party tool activations in this
instance of Canvas.
• It is possible to use iframe and embed code to bring in many of the same
resources without having to use the app activation.
• Follow-on questions:
• Which tools are being used and how?
• Which third-party apps and tools are being discontinued?
177
178
179
180
181
EXTERNAL TOOL ACTIVATIONS IN 2013 / 2014 /
2015 / 2016 AND T&L
• What are the external tools being activated based on the alphabetical
histogram bar charts in the prior few slides? What could explain some of the
year-over-year differences?
• Follow-on questions:
• How can instructors be encouraged to experiment with third-party apps that can add
functionality and contents to their teaching and learning?
182
15. ABOUT COURSE USER INTERFACE (UI)
NAVIGATION ITEM STATES
183
184
COURSE USER INTERFACE NAVIGATION ITEM STATE
AND T&L
• The course user interface navigation item state refers to whether an object in the
course’s left navigation is “visible” or “hidden.”
• If this chart is a summary, a majority of the available navigation items are showing;
however, this may or may not be positive, given that some instructors do not hide
navigation items they do not use (and end up confusing students).
• Follow-on questions:
• What are ways to encourage instructors to use the LMS to its full functionality but to hide what
they do not use (to support teaching and learning, without adding to the cognitive load)?
185
SUMMARY
186
BACKGROUND KNOWLEDGE REQUIRED
• To make effective use of this LMS data portal data, it helps to know something of the
following:
• The institution of higher education
• What is going on in the front-end of the LMS (through admin access)
• Understandings of online teaching and learning
• Understandings of what happens when a static course goes live with the animating
presences of people and actual teaching
187
CHALLENGES TO USING THIS LMS DATA PORTAL
DATA
• So far, it has been
• difficult to make the “business case” for value on campus,
• difficult to query for learning value directly (such as learning sequences per learner in a
course), and
• difficult, in campus political environments, to change instructors’ uses of the LMS
(without stepping on others’ decision-making).
• But this effort has been an informal one…
188
EXTENDING LMS DATA PORTAL DATA FOR
TEACHING AND LEARNING
• Improving teaching and learning is hard work. There are political
implications to this effort on a campus (of course).
• Instructors and graduate teaching assistants (GTAs) at the front lines would
enhance the data analyses and are the ones who can most effectively apply
the data to improved teaching and learning; they need to be “onboarded”
for this work.
• The current research insights may be more broadly shared to strengthen the
application of the data.
189
FINDING COMPARABLE EXTERNAL DATA
• While the focus is on institutional improvements and support for teaching and
learning, it is possible to go “macro” and beyond the walls of the
university…to larger contexts.
• To that end, it would help to have…
• comparable data from other, similar institutions of higher education (at similar
developmental stages of LMS rollout), and
• some sort of comparable baseline data.
190
SOFTWARE USED
• The software used for this presentation includes the following: 7Zip, Gadwin
PrintScreen, Microsoft Access, SQL Express, LIWC, NVivo 11 Plus, MS Excel
2016, MS Visio, Adobe Photoshop, MS PowerPoint, and others.
191
CONCLUSION AND CONTACT
• Dr. Shalin Hai-Jew
• iTAC, Kansas State University
• 212 Hale Library
• 785-532-5262
• shalin@k-state.edu
• The data visualizations (by the presenter) come from “Wrangling Big Data in a Small
Tech Ecosystem,” formally published in Summer 2017:
http://scalar.usc.edu/works/c2c-digital-magazine-spring--summer-2017/wrangling-
big-data-in-a-small-tech-ecosystem.
192
  • 1. USING LARGE-SCALE LMS DATA PORTAL DATA TO IMPROVE TEACHING AND LEARNING (AT K-STATE) INTERNATIONAL SYMPOSIUM ON INNOVATIVE TEACHING AND LEARNING AND ITS APPLICATION TO DIFFERENT DISCIPLINES DIGITAL POSTER SESSION SEPT. 26 – 27, 2017 KANSAS STATE UNIVERSITY – TEACHING & LEARNING CENTER
  • 2. SESSION DESCRIPTION • With any learning management system, a byproduct of its function is data, which may be analyzed to improve awareness, decision-making, and actions. At Kansas State University, its Canvas LMS instance recently made available its cumulative data from its first use in 2013. These flat files open a window to how the university is harnessing its LMS, with some macro- level insights that may suggest some areas to improve teaching and learning. This session describes some approaches to informatizing this empirical “big data” with some basic approaches: reviewing the data dictionary, extracting basic descriptions of the respective data sets, conducting time-based comparisons, surfacing testable hypotheses from data inferences, and conducting other data explorations. This introduces initial data analysis work only, but this does not preclude front-end analysis of courses at the micro level, relational database queries of the data, and other potential follow-on work. 2
  • 3. PRESENTATION ORDER • K-State Online Canvas LMS data portal data 1. About Courses 2. About Course Sections 3. a. About Assignments b. About Submitted Assignments 4. About Quizzes 5. About Discussion Boards 6. About Learner Submitted Files 7. About Uploaded Files 8. About Wikis and Wiki Pages 9. About Enrollment Role Types 3
  • 4. PRESENTATION ORDER (CONT.) 10. About Groups 11. About Users and Workflow States 12. About Course Level Grades (based on Enrollments) 13. About Conversations (In-System Emails) 14. About Third-Party External Tool Activations 15. About Course User Interface (UI) Navigation Item States • Summary 4
  • 5. K-STATE ONLINE CANVAS LMS DATA PORTAL DATA 5
  • 6. 6
  • 7. 7
  • 8. K-STATE ONLINE CANVAS LMS DATA PORTAL DATA • LMS data portal data comes from event logs and trace files captured as part of the provision of learning management system (LMS) services • These originate from SQL and include structured quantitative data and text data • K-State data comes in 79 data tables, which download as .gz files in a zipped folder • .gz files are opened using 7Zip • These are then files without extensions • .csv (comma separated values format) may be added on the end to make the files readable in Access, SQL, Excel, etc. 8
  • 9. K-STATE ONLINE CANVAS LMS DATA PORTAL DATA (CONT.) • Data is updated every day with K-State’s Canvas license. • The data is already in digital format. • The data structures are in classic data table format with student data in rows and variable data in columns. • What default settings are chosen for an LMS instance will likely have a weighty influence on the uses of the LMS functions. 9
  • 10. DATA HANDLING • There are millions of rows of data in some of the tables. • The data has to be handled properly so there is no lossiness in the handling. • As always, a pristine and “untouchable” (you can’t revise or change the raw set in any way) raw set of data should be maintained as a hedge against data mishandling. A working set of this data can be accessed for data cleaning and queries… • The working set of data requires a little cleaning to be useful. • Outlier data may have to be eliminated so as not to skew curves. • Data garble should be omitted. Sometimes, garble is collected. Sometimes, people use the LMS in a way that create garble. 10
  • 11. A DATA DICTIONARY OR “SCHEMA DOCS” • There is a data dictionary or “Schema Docs” that describe the unlabeled column data. Not all variables mentioned in the data dictionary are used because of different choices for different instances (what universities choose to collect or not collect, use or not use)…and because of changes in data management over time…and other factors. (Some data columns are discontinued / “deprecated” / no longer supported.) • Many of the categories of data are instantiated in multiple data tables, so these have to be combined (union-joined) for queries involving the full set of N. 11
  • 12. THIS DATA ANALYSIS APPROACH • The available information and how it is queried can elicit insights for awareness and decision-making. • This approach here shows uses of computers to capture meaning. This does not preclude some “human close reading,” but the amounts of data require computers to some degree. 12
  • 13. ADDITIONAL ANALYTICAL APPROACHES • Data analysis never happens in a vacuum. • The “owner” (“brand ambassador”) of the instance is a good source of information. • The back-end information from the LMS data portal can be combined with front-end accesses for deeper insights about how the LMS can be used. • The digital data may be compared with other data from other sources assuming unique identifier columns may be identified and used. (These include primary and foreign key columns.) 13
  • 14. ADDITIONAL ANALYTICAL APPROACHES (CONT.) • If baseline data is available from other comparable and non-comparable instances, those could be informative. • Are there “proper” or normative levels of activities that should be observed for LMSes in particular stages of a life cycle? 14
  • 15. ANALYTICS APPROACHES Given the use of flat files and without direct application of primary and foreign keys… Given database queries… Given computation-based linguistic analysis tools… Given qualitative data analysis tools… What is knowable from the data? What would benefit teaching and learning (T&L)? 15
  • 16. ANALYTICS APPROACHES (CONT.) DESCRIPTIVE • How is the LMS instance being used by the faculty, staff, and administration at the university? • How has the LMS instance been used over time? PRESCRIPTIVE • What are ways to improve the university’s uses of the LMS and the related integrated tools? • How can the LMS be used in a beneficial way into the future? 16
  • 17. PRESENTATION: • 15 areas of insights in the Canvas LMS (with Oct. 2016 data) expressed as data visualizations • Ways to interpret the available data (in terms of macro-level use of the LMS) • Ways to harness that data for improving online teaching and learning with LMS data and beyond 17
  • 19. 19
  • 20. COURSE VISIBILITY AND T&L • In the lead-up period to an academic term, are courses publicly viewable to learners who may want to acclimate and get a start on the work? • Follow-on questions: • If so, what is available, and what can learners see and do? 20
  • 21. 21
  • 22. COURSE WORKFLOW STATES AND T&L • How many courses are “hard concluded” (completed) at a particular time of the term (vs. how many should be)? • “Claimed” courses are undeleted ones that have not yet been published, and how many are in this state (which requires helpdesk or higher level support). • Follow-on questions: • Are there ways to head off accidental deletions of courses? 22
  • 23. 2. ABOUT COURSE SECTIONS 23
  • 24. 24
  • 25. LIFE CYCLE STATE FOR COURSE SECTION AND T&L • How many courses on the system are active vs. deleted? • Follow-on questions: • What is a healthy balance of active to deleted courses? Why? • Online courses can be wholly recreated from ground-up without deletion…but deletion itself is also low-cost. It might signal that a course may not be used again. If so, why? • Sometimes, people delete courses because they don’t want the course to show up on their course listings. How should that be handled instead (using the stars to select which courses initially show up on the user dashboard). • Sometimes, people delete courses willy-nilly, and that requires reinstating a deleted course. 25
  • 26. 26
  • 27. DATE RESTRICTION ACCESSES FOR COURSE SECTIONS AND T&L • What are the proportion of course sections in the LMS instance that have restricted enrollments to section availability dates? • Follow-on questions: • When is there the “use case” of restricted section access to defined dates? Is that the best way to handle that “use case” / those “use cases”? Are there negative unintended consequences of using such features or not using such features? 27
  • 28. 28
  • 29. ABILITY TO SELF-ENROLL IN A SECTION OR NOT AND T&L • There are two general types of sections when it comes to online courses. One is a general section linked to a formal course. These are created based on learner enrollments in an online enrollment system. • Others are sections created by instructors to enable segmentation of courses for different assignments or tracks. • Manual self-enrollment to a section applies to instructor-created sections with opt-in learning tracks. Assignment to sections also applies to instructor-created sections. 29
  • 30. ABILITY TO SELF-ENROLL IN A SECTION OR NOT AND T&L (CONT.) • Follow-on questions: • How important is it for learners to be able to select their own sections in respective learning sequences? • What are the methods that instructors use to assign learners to sections (can either be random or manually)? 30
  • 32. 32
  • 33. TYPES OF ASSIGNMENTS AND T&L • Is there a baseline count for assignment types? Are there optimal mixes for a university of K-State’s size? • What does the frequency of the various assignment types mean for how the assignments are being used at the university? • What is the proportion of graded to ungraded assignments, and which are preferable when? • Are ungraded assignments used for formative assessments to support learning, and are these opt-in or required? (and if so, in what contexts?) 33
  • 34. TYPES OF ASSIGNMENTS AND T&L (CONT.) • Follow-on questions: • How can a university’s administration and staff encourage more media creation assignment types (if relevant to the learning)? 34
  • 35. 35
  • 36. TIME FEATURES FOR ASSIGNMENTS AND T&L • What types of assignments fit into each category of time features? • Why are so many assignments without time deadlines or allotments? • Is this a negative in an LMS that has an auto-created calendar and auto-created syllabus based in part on deadlines? • How do instructors use deadlines on assignments? And the converse: How do instructors not use deadlines on assignments? • Is it beneficial to learners to have no deadlines, some deadlines, or all-defined deadlines? And for which assignments? 36
  • 37. TIME FEATURES FOR ASSIGNMENTS AND T&L (CONT.) • Follow-on questions: • How soon before an assignment is due is it unlocked? Would learners do better if they had more time to prepare for an assignment before its unlocking? 37
  • 38. 38
  • 39. MAIN THEMES AUTO-IDENTIFIED IN ASSIGNMENT NAMES AND T&L • In terms of assignment names, what are the most frequent words used? • What do these (particularly subtopics) suggest about the work that learners do? • Follow-on questions: • What are some “long tail” terms from this set of assignment names? The terms in the “long tail” are those used only infrequently. • What about controversial terms used in the assignment names? 39
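The frequency counts behind such a view can be sketched as a simple unigram tally; the assignment names and the stopword list below are invented for illustration, not drawn from the actual text set.

```python
import re
from collections import Counter

# Invented stopword list; a real analysis would use a fuller one.
STOPWORDS = {"the", "a", "an", "of", "for", "and", "to"}

def word_frequencies(titles):
    """Count unigrams across a set of assignment names."""
    words = []
    for title in titles:
        words += [w for w in re.findall(r"[a-z]+", title.lower())
                  if w not in STOPWORDS]
    return Counter(words)

# Invented assignment names standing in for the real text set.
titles = ["Lab Report 1", "Lab Report 2", "Final Exam", "Reflection Essay"]
freq = word_frequencies(titles)
print(freq.most_common(2))  # [('lab', 2), ('report', 2)]
# The "long tail" is everything with a count of 1:
print([w for w, c in freq.items() if c == 1])
```

Sorting the full table descending by count and reading from the bottom up surfaces the infrequent “long tail” terms mentioned above.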
  • 40. 40
  • 41. SOME LINGUISTIC FEATURES OF THE ASSIGNMENT TITLES AND DESCRIPTIONS AND T&L • Assignment titles and descriptors rank very high on analytic features (92nd percentile). They rank in the 73rd percentile in clout and the 65th percentile in tone, but only in the 12th percentile for authentic tone (warmth). • Follow-on questions: • Are there ways to improve the sense of human warmth in assignment descriptions? Would that be beneficial or harmful for the learning? 41
  • 42. 42
  • 43. DELVING INTO TOPICS OF INTEREST AND T&L • It is possible to select terms and phrases (unigrams / one-grams, bigrams / two-grams, three-grams, four-grams, etc.) to explore in the text set, to see the words leading up to the target terms and the terms leading away. In the prior word tree, “lab” was the target term. • Follow-on questions: • In assignment titles and assignment descriptions, any number of terms may be of interest. 43
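Extracting n-grams around a target term is straightforward; this sketch uses an invented sentence and mirrors the “lab” target from the word tree above.

```python
def ngrams(tokens, n):
    """Return all n-token sequences from a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Invented sentence; "lab" mirrors the target term in the word tree above.
tokens = "complete the lab report before the lab session".split()
bigrams = ngrams(tokens, 2)
# Bigrams leading away from the target term:
leading_away = [bg for bg in bigrams if bg[0] == "lab"]
print(leading_away)  # [('lab', 'report'), ('lab', 'session')]
```

Raising `n` to 3 or 4 widens the context window on either side of the target term, which is how the longer branches of a word tree are built.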
  • 44. 44
  • 45. GRADES VIEWABLE BY STUDENTS? MUTED VS. UNMUTED ASSIGNMENTS AND T&L • A minority of student assignments’ grades are muted (whether temporarily or permanently) in a course. Also, in some courses, some or all of the grades may be muted. • What are some ways that instructors use grade muting? • What does grade muting enable bureaucratically or otherwise (such as for grade adjustments)? • How may grade muting enable learners to learn with less self-imposed or externally imposed pressure? 45
  • 46. GRADES VIEWABLE BY STUDENTS? MUTED VS. UNMUTED ASSIGNMENTS AND T&L (CONT.) • Follow-on questions: • How aware are instructors of the assignment / quiz grade muting and unmuting functions? What about learners? • Do learners miss out on the benefits of assigned grade and other feedback if assignment muting is applied? • Do learners miss out on the benefits of not seeing assigned grades, such as less pressure and less anxiety, if grade muting is not applied? 46
  • 47. 47
  • 48. ASSIGNMENT WORKFLOW STATES AND T&L • Why are assignments put into published, unpublished, and deleted states? • Follow-on questions: • Are there ways to improve the teaching and learning experience for learners by making assignments more readily available in a sequence such as in modules? • Are there ways to improve the learning experience by rolling out learning in time, so learners are not overwhelmed by an entire revealed course at the beginning? 48
  • 49. 49
  • 50. SURVIVAL FUNCTION OF ASSIGNMENTS TO UPDATE AND T&L • How long do assignments “survive” before they are updated? • What does it mean that some assignments may not have been updated for over a thousand days (a little less than three years)? • What does it mean that assignments that are updated tend to be updated shortly after they were created? Does this mean better quality of update or not? 50
  • 51. SURVIVAL FUNCTION OF ASSIGNMENTS TO UPDATE AND T&L (CONT.) • Follow-on questions: • How quickly are assignments used after they are created? How often are assignments updated as soon as there is feedback from learners? • How many assignments are never updated after the first point-of-creation? What are the proportions? • Why do instructors and GTAs update assignments? • What are the most common updates to assignments? Are these improvements beneficial to the learning or not? 51
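The “survival” durations underlying such a curve can be computed from creation and update timestamps; the field names and sample records below are invented, and the actual column names should be confirmed in the data dictionary.

```python
from datetime import datetime

def days_to_update(rows, fmt="%Y-%m-%d"):
    """Days from creation to last update; rows never updated are skipped."""
    durations = []
    for row in rows:
        created = datetime.strptime(row["created_at"], fmt)
        updated = datetime.strptime(row["updated_at"], fmt)
        if updated > created:
            durations.append((updated - created).days)
    return durations

# Invented records standing in for assignment rows from a flat file.
rows = [
    {"created_at": "2016-01-10", "updated_at": "2016-01-12"},  # updated quickly
    {"created_at": "2016-01-10", "updated_at": "2016-01-10"},  # never updated
    {"created_at": "2013-09-01", "updated_at": "2016-08-30"},  # ~3 years later
]
print(days_to_update(rows))  # [2, 1094]
```

Feeding these durations into a Kaplan-Meier estimator (or simply a histogram) produces the survival and one-minus-survival curves shown in this deck.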
  • 52. 3B. ABOUT SUBMITTED ASSIGNMENTS 52
  • 53. 53
  • 54. GRADES SUBMITTAL COUNTS FOR COMPLETED ASSIGNMENTS AND T&L • A majority of the submitted assignments have received grades, but a sizable number have not. • Follow-on questions: • How quickly do learners expect grades to arrive? What actions do they take if grades haven’t arrived within a certain amount of time? • How often do they check their grades? • How much weight do learners give to the grades they receive? 54
  • 56. 56
  • 57. A SURVEY OF QUIZ TYPES AND T&L • In terms of quiz types, a majority are “assignment” types, then practice quizzes, then graded surveys, and then surveys. What are the respective functionalities of each? How many at a campus are aware of the various types of quizzes and their respective functionalities? • Follow-on questions: • How do the uses of the LMS instantiate the various types of quizzes? What are some constructive models for the respective uses of each? 57
  • 58. 58
  • 59. QUIZ QUESTION TYPES IN THE LMS INSTANCE AND T&L • A majority of question types used in the LMS are based on automated assessment. Some—essay questions, file uploads, short answer questions, text only questions—often require human interventions. • Follow-on questions: • What sorts of assignments use human-intervention-type questions? • Is there higher quality feedback with non-automated types of assessments by experts (vs. GTAs)? 59
  • 60. 60
  • 61. QUIZ QUESTION WORKFLOW STATES AND T&L • A majority of quiz questions are unpublished. Some are published. A small number are deleted. • Follow-on questions: • What quiz questions are deleted, and why? Are these replaced? • How many of the quiz questions are from third-party content providers? 61
  • 62. 62
  • 63. AN INCLUSIVE SCATTERPLOT OF QUIZ POINT VALUES AND T&L • The min-max range on quiz point values is 0 – 23,000. • A majority of quiz values are very low comparatively. • The average point value of a quiz in the LMS is 33 points (without zeroes integrated) and 28 points (with zeroes integrated). 63
  • 64. AN INCLUSIVE SCATTERPLOT OF QUIZ POINT VALUES AND T&L (CONT.) • Follow-on questions: • What do you have to do to pass a 23,000 point quiz?! • If defined, how much time is allowed / expected for a quiz? • What does a low-value assessment look like? A high-value assessment? • What are typical quiz designs? What are atypical quiz designs? 64
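The two averages reported above (with and without zero-point quizzes) can be reproduced with a small helper; the point values below are invented stand-ins for the real distribution, which ranged from 0 to 23,000.

```python
def point_means(points):
    """Mean quiz point value without and with zero-point quizzes included."""
    nonzero = [p for p in points if p > 0]
    return sum(nonzero) / len(nonzero), sum(points) / len(points)

# Invented point values; the real set ranged from 0 to 23,000.
points = [0, 10, 20, 30]
without_zeros, with_zeros = point_means(points)
print(without_zeros, with_zeros)  # 20.0 15.0
```

Because zero-point quizzes drag the overall mean down, reporting both figures (as the slide does) makes the comparison honest.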
  • 65. HISTOGRAM OF QUIZ POINT VALUES IN LMS INSTANCE (WITH A NORMAL CURVE) AND T&L 65
  • 66. 66
  • 67. SURVIVAL CURVE OF DELETED QUIZZES AND T&L • Quizzes that are ultimately deleted tend not to last very long. • Follow-on questions: • Why are quizzes deleted (instead of revised)? 67
  • 68. 68
  • 69. ONE MINUS SURVIVAL FUNCTION CURVE FOR DELETED QUIZZES AND T&L • Quizzes that last a certain number of days tend to survive without being deleted. Why? • Follow-on questions: • What is it that instructors look for in a quiz to ensure that they will continue to use it? • Once instructors have committed to a quiz, how long will they tend to use that quiz without revision? 69
  • 71. 5. ABOUT DISCUSSION BOARDS 71
  • 72. 72
  • 73. TYPES OF DISCUSSION BOARDS: ANNOUNCEMENT VS. DEFAULT AND T&L • Discussion topic types may be either “announcement” or “default” (blank). An “announcement” type has text in the body; a “default” one just has a title but no body text or prompt. • Follow-on questions: • How are announcement discussion boards set up for teaching and learning? How are default discussion boards set up for teaching and learning? • Do learners do better with more prompts for contents or not? 73
  • 74. 74
  • 75. WORKFLOW STATES OF DISCUSSION BOARDS AND T&L • Discussion board topics may be in various states: unpublished, active, locked, deleted, and post_delayed. When are these various states practically applied in a live learning context? To what end? • Follow-on questions: • How are these various states of discussion boards used and instantiated in the LMS instance? Which are beneficial to learning, and which not? Which are beneficial to teaching, and which not? 75
  • 76. 76
  • 77. ACTIVE VS. DELETED DISCUSSION BOARD ENTRIES (REPLIES) AND T&L • A majority of discussion board entries are left active and available, but a minority are deleted. • Follow-on questions: • Which ones end up being deleted, and why? • What information is lost with the deletion of discussion board entries (replies)? 77
  • 78. 6. ABOUT LEARNER SUBMITTED FILES 78
  • 79. 79
  • 80. HANDLING OF LEARNER SUBMISSIONS AND T&L • In one slice-in-time, a majority of learner-submitted assignments (file uploads) were graded, and a smaller share was not graded. A small minority was auto-graded (maybe code uploads corrected with a scripted auto-grader?). • What types of assignments do instructors choose to auto-grade? • What types of assignments do instructors choose to self-grade / manually grade? 80
  • 81. HANDLING OF LEARNER SUBMISSIONS AND T&L (CONT.) • Follow-on questions: • What sorts of files are requested in file upload assignments? Diagrams? Maps? Photos? Designs? Audio files? Video files? Papers? Others? • What are some ways that learners benefit from having fewer auto-graded files? What are some costs to using a fair amount of instructor-graded works? • How quick of a turnaround do learners expect for learner-submitted assignments? • How much feedback do they expect? • How many of these are peer-assessed vs. GTA vs. self vs. instructor assessed? • How many of these assignments are made public to the course learners, as in an online gallery? 81
  • 82. 82
  • 83. SOME COMMON WORDS FROM COMMENTS MADE ON SUBMISSIONS AND T&L • When learners submit file uploads, they often type text into the field accompanying the file upload. The prior word cloud shows frequency counts of words from comments made with the digital file submissions. • Follow-on questions: • What are substantive content terms used in the text learners use when uploading digital files? • How positive or negative are the sentiments expressed as learners are uploading the files? • What are the main purposes of the textual contents when learners share a message with their instructor when uploading a file for an assignment? • What are common questions when learners share a message with their instructor when uploading a file for an assignment? 83
  • 84. 84
  • 85. SUBMISSION COMMENT PARTICIPATION TYPE AND T&L • Submission comment types may be categorized into three types: admin, author, and submitter. • The admin may be the instructor. • The author may be whoever wrote the message. • The submitter may be whoever submitted the file. (The data dictionary does not seem clear on this.) 85
  • 86. SUBMISSION COMMENT PARTICIPATION TYPE AND T&L (CONT.) • Follow-on questions: • What sorts of learning interactions go on around uploaded files? • What are potential learning gains for the instructors? The co-learners? The target learner who shared the file? 86
  • 87. 7. ABOUT UPLOADED FILES 87
  • 88. 88
  • 89. UPLOADS AND REVISIONS OF FILES TO THE LMS INSTANCE BY YEAR AND T&L • The Files area works as a “loading dock” for the particular online course. It enables the upload of a limited amount of digital files, which may be pointed to from the Pages, Syllabus, Modules, and other sections of the online course. • Some instructors use the Files area directly for learners to download course information and datasets. • In the first four years of the LMS’s use at K-State (we’re in Y5 now), there has been growing usage of this feature. • This feature is bolstered by the use of Mediasite as a third-party video hosting site and player (and desktop lecture capture tool). 89
  • 90. UPLOADS AND REVISIONS OF FILES TO THE LMS INSTANCE BY YEAR AND T&L (CONT.) • Follow-on questions: • What sorts of files are being uploaded? Content-wise, is there inherent teaching and learning value? • Depending on how the digital contents are harnessed for teaching and learning, what is the learning value? 90
  • 91. 91
  • 92. OBSERVED UPLOADED FILE TYPES AND T&L • In descending order, the top 10 most popular file types uploaded into this instance of Canvas were: .docx, .pdf, .jpg, .png, .pptx, .xlsx, .ppt, .zip, .dat, and .xl. The top 10 file types include text files, image files, slideshow files, folders of digital contents, data files, and spreadsheet files. Further on, there are videos, web pages, audio files, and others. • The uploaded file types may indicate what kinds of technologies learners have access to in their learning. 92
  • 93. OBSERVED UPLOADED FILE TYPES AND T&L (CONT.) • Follow-on questions: • How are the various file types used in respective assignments? • What is the quality of the online learning contents that learners create? • Are there ways to increase the multi-modality of the digital objects that learners create? 93
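A top-N ranking of uploaded file types like the one above reduces to tallying file extensions; the file names below are invented stand-ins for the uploaded-files table.

```python
from collections import Counter
from pathlib import PurePosixPath

def extension_counts(filenames):
    """Tally uploaded files by case-normalized extension."""
    return Counter(PurePosixPath(name).suffix.lower() for name in filenames)

# Invented file names standing in for the uploaded-files records.
files = ["syllabus.pdf", "Notes.DOCX", "lab1.docx", "chart.png"]
ext = extension_counts(files)
print(ext.most_common(1))  # [('.docx', 2)]
```

Case-normalizing the suffix matters here, since “Notes.DOCX” and “lab1.docx” should count as the same file type.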
  • 94. 94
  • 95. WORD CLOUD OF FILE CONTENTS (FROM THE DESCRIPTIONS OF FILE CONTENTS) AND T&L • File contents are the files uploaded through the “loading docks” as well as files uploaded to help create individual profiles within the Canvas LMS (for the particular instance). • The word cloud on the prior slide shows the most frequent words in the names of the uploaded files. The assumption is that the file names are informational (some are; others are not). • In this case, the long tail may be more informative…since the most frequent words will be taken up with generic terms labeling what the files may be for. 95
  • 96. WORD CLOUD OF FILE CONTENTS (FROM THE DESCRIPTIONS OF FILE CONTENTS) AND T&L (CONT.) • Follow-on questions: • What file naming protocols can instructors use to be as specific and informative about file contents as possible? • If the Files area is used by instructors in a published way, what folder structure, folder naming protocols, and folder names can be used to be as informative as possible? 96
  • 97. HIGH FREQUENCY WORD COUNTS IN THE FILE NAMES SET (AS ONEGRAMS / UNIGRAMS) AND T&L Some of the terms, like “reflection” and “review,” are indicative of pedagogical awareness. The references to “guide” and others are also indicative of cognitive scaffolding. 97
  • 98. 8. ABOUT WIKIS AND WIKI PAGES 98
  • 99. 99
  • 100. PARENT TYPES FOR WIKI PAGES AND T&L • Parent types for wiki pages may be either “course” or “group.” A “course” wiki page points to the pages created at the course level and that may be shared in a modular or other context. A “group” wiki page points to pages that learners in groups may have created or co-created. For example, Groups can have home pages. 100
  • 101. PARENT TYPES FOR WIKI PAGES AND T&L (CONT.) • Follow-on questions: • What are the purposes of wiki pages created at the “course” level? What sorts of digital contents are included in each? • What are the purposes of wiki pages created at the “group” level? What sorts of digital contents are included in each? • What learning activities are related to the respective wiki pages? How effective are these learning activities in context? 101
  • 102. 102
  • 103. WIKI PAGE WORKFLOW AND T&L • The wiki page workflow state can be in four conditions: active, deleted, unpublished, and null. “Null” may be a default state when a page exists but has no contents, and such pages may be pre-made once groups are created, for example. (The documentation is not clear.) • An active page is one that has been created and can be viewed. • A deleted page is one that no longer is available. • An unpublished page is one that is being drafted. 103
  • 104. WIKI PAGE WORKFLOW AND T&L (CONT.) • Follow-on questions: • What are some creative wiki pages that may be an inspiration to other teachers and learners? • What are some functions of wiki pages that people do not use often like • iframes (inline frames) • embedded video • third-party software integrations (like Twitter streams), or others? • How can more creative work on the pages be encouraged? 104
  • 105. 105
  • 106. WORD FREQUENCY WORD CLOUD FROM WIKI PAGE TITLES AND T&L • Wiki page titles refer to the text-based names per each page. Those are required fields. The word cloud on the prior slide refers to the most frequent terms found in this text set of titles (treated as a “bag of words”). • A glance at the word cloud shows action words related to learning: studio, lab, teaching, project, design, clinical, experience, and public. • There are subject words, too: physics, chemistry, grain, infant, and others. • If nothing else, these are suggestive of some of what the pages do. • In any word frequency count, the “long tail” of few or even single exemplars also will have value here. 106
  • 107. WORD FREQUENCY WORD CLOUD FROM WIKI PAGE TITLES AND T&L (CONT.) • Follow-on questions: • How do these word clouds change over time? Are there differences between these word frequency counts from wiki page titles year over year? Between departments? Between domains? • What about non-English terms in wiki page titles (given the affordances of UTF-8)? 107
  • 108. 9. ABOUT ENROLLMENT ROLE TYPES 108
  • 109. ABOUT ENROLLMENT ROLE TYPES
Role Name | Basic Role Type
Librarian | TAEnrollment
StudentEnrollment | StudentEnrollment
TeacherEnrollment | TeacherEnrollment
TAEnrollment | TAEnrollment
DesignerEnrollment | DesignerEnrollment
ObserverEnrollment | ObserverEnrollment
Grader | TAEnrollment
GradeObserver | TAEnrollment
109
  • 110. UNIVERSITY-DEFINED ROLES AND CAPABILITIES AND T&L • “Enrollment role types” are defined by the university. The “role name” is the publicly facing side of the role, and the “basic role type” deals with the role- based functionalities (built on the idea of “least privilege” or “give people only as much access as they need so as not to compromise security”). • For example, librarians have as much access as teaching assistants do, based on the table. • Do site users have all the access that they need for what they need to do? • Is the system resilient against potential deletion of data? (sorta) 110
  • 111. UNIVERSITY-DEFINED ROLES AND CAPABILITIES AND T&L (CONT.) • Follow-on questions: • Are there super roles that need to be created beyond “admin”? • Are there more circumscribed roles that need to be created beyond “observer”? • Are there dedicated roles for one-off applications? 111
  • 112. 112
  • 113. FREQUENCIES OF ENROLLMENT ROLES AND T&L • The treemap on the prior slide shows “frequencies of enrollment roles,” with the most popular roles in the following descending order: students, teachers, studentview (observer), teaching assistant, and designer enrollments. • Are the relative frequency counts / proportions correct? • Follow-on questions: • Are there new roles that need creating or current roles that need revision or tweaking? 113
  • 114. 114
  • 115. TOP DOZEN COMPUTER SYSTEM CONFIGURATIONS FOR ACCESSING LMS INSTANCE AND T&L • This section captures considerable nuance about the computer systems used to connect to Canvas for teaching and learning. These data inform which technologies should be used to build digital learning objects, the output formats to produce, and the accommodations needed to ensure playability and accessibility. • Based on the technologies that people use, digital learning objects need to be tested on the Firefox browser and on mobile devices. However, there’s more… 115
  • 116. TOP DOZEN COMPUTER SYSTEM CONFIGURATIONS FOR ACCESSING LMS INSTANCE AND T&L (CONT.) • Follow-on questions: • What are all the main ways that people connect to the K-State instance of Canvas? How can universal design be applied to ensure that they can all access the contents in the most accessible way possible? • Also, how can the Canvas apps for iOS and Android be designed for optimal small-screen access? 116
  • 117. 117
  • 118. REQUEST TYPES AND T&L • The most common types of activities on the Canvas site for the K-State instance are “GET” and “POST.” In other words, people retrieve (“read”) contents, and they upload or “create” contents. • Follow-on questions: • There are two main request types. Are the other types necessary, and if so, how can their use be encouraged? 118
  • 120. 120
  • 121. GROUP NAMES FREQUENCY WORD CLOUD AND T&L • Instructors sometimes use inspiring group names to rally their learners. The word cloud on the prior slide shows the most common words found in the text set of group names. • Again, it is important to check the “long tail” of the resulting Pareto chart from the data table with the word frequency counts. • Follow-on questions: • Learner motivation is important. What are ways to encourage instructors to have creative and inclusive / respectful names for learner teams? 121
  • 122. 122
  • 123. MODERATOR STATUS OF LEARNERS IN GROUPS AND T&L The “moderator status” refers to assigned leadership in learner groups. In the K-State instance, very few instructors have decided to go with student leaders in the groups, preferring leadership to emerge (rather than be assigned), apparently. 123
  • 124. MODERATOR STATUS OF LEARNERS IN GROUPS AND T&L (CONT.) • Follow-on questions: • When does it make sense to assign student leaders instead of have them emerge? • Should “moderators” be trained? Would their work be supervised by the instructor? • How can leadership be brought into play without learners feeling disenfranchised? 124
  • 125. 125
  • 126. LEARNER MEMBERSHIP STATUS IN GROUPS AND T&L • Based on the numbers in the prior bar chart, learner membership statuses were either accepted or deleted. There were none in process—none invited without an answer, and none “requested” without approval or disapproval. • Instructors may not be using groups that people can apply for membership in, and if this feature is beneficial to learning, it may be important to explore this feature and put it into play strategically and tactically. 126
  • 127. LEARNER MEMBERSHIP STATUS IN GROUPS AND T&L (CONT.) • Follow-on questions: • How well are groups being used in an LMS? Do they create a sense of camaraderie and collegiality that may be supportive of learning? • What is the level of collaborative creative work in learner groups? • What is the level of discourse in learner groups? • What are the social dynamics in learner groups? 127
  • 128. 11. ABOUT USERS AND WORKFLOW STATES 128
  • 129. 129
  • 130. USER “WORKFLOW” STATES AND T&L • Users, the people who are in an LMS instance, may have their accounts in one of four stages: creation_pending, deleted, pre_registered, and registered. • As may be seen in the prior slide’s pie chart, very few are waiting to have their accounts approved (and there are internal organizational processes for that). • Some have been deleted (also based on internal policies and practices). • Some who have been pre-registered, maybe as visiting high school students or external collaborators on online courses and trainings, are in the instance in not-insignificant numbers. (It’s not quite clear in the data dictionary what the respective roles may be.) 130
  • 131. USER “WORKFLOW” STATES AND T&L (CONT.) • Follow-on questions: • Are people being processed in- or out- in sufficiently fast and efficient ways for teaching and learning? • For learners who need extended access, are there systems and policies and trained people in place to meet their needs? 131
  • 132. 132
  • 133. YEARS OF ORIGINATION OF USER ACCOUNTS AND T&L • The years of origination show a major push in 2014 to get everyone on the system. • Then, account creation seems to have settled into a more baseline rate of learners going into the online system. • Not all students have online accounts, but many F2F courses and blended courses also use the LMS. Also, there are non-course uses of the LMS. 133
  • 134. YEARS OF ORIGINATION OF USER ACCOUNTS AND T&L (CONT.) • Follow-on questions: • Based on knowledge of the campus and its flow of people, how well are they being integrated into the affordances of the LMS? Is there enough technical support and other support to ensure that users’ needs are met? 134
  • 135. 135
  • 136. RETIRED ACCOUNTS = REGISTERED FALSE AND T&L • The word cloud on the prior slide gives a light human sense of those who have been off-ramped from the LMS. • First names alone make these accounts impossible to re-identify, but rare last names from the long tail of the data may make some of them somewhat re-identifiable (so, as always with data, use with care). 136
  • 137. RETIRED ACCOUNTS = REGISTERED FALSE AND T&L (CONT.) • Follow-on questions: • Whose accounts are being retired, and why? 137
  • 138. 138
  • 139. CREATED PSEUDONYMS AND T&L • “Pseudonyms” are the “logins associated with users,” which can enable integrations with other databases to capture information at the individual human level. • In the pseudonyms category, there can be other identifiers tied to a person. • These may be used to anonymize individuals to enable data extraction and research without the risk of leaking personally identifiable information (PII). 139
  • 140. CREATED PSEUDONYMS AND T&L (CONT.) • Follow-on questions: • This area is related more with developer and DBA work. 140
  • 141. 141
  • 142. CURRENT “STATES” OF PSEUDONYMS AND T&L • 97% of pseudonyms are active, and 3% are deleted. • Proper management of pseudonyms means that those who are active users should be included, and those who no longer are should have their pseudonyms deleted, to keep the records clean and current. 142
  • 143. CURRENT “STATES” OF PSEUDONYMS AND T&L (CONT.) • Follow-on questions: • How accurately maintained are the pseudonyms? • Are they set up as accurately as possible? 143
  • 144. 12. ABOUT COURSE LEVEL GRADES (BASED ON ENROLLMENTS) 144
  • 145. 145
  • 146. NUMBERS OF ATTEMPTS FOR LATEST SUBMITTED ASSIGNMENTS AND T&L • The numbers of attempts on assignments tends towards only one or two attempts, for assignments that enable more than one submittal. • A majority are “null,” which may suggest that there is only a one-time submittal. 146
  • 147. NUMBERS OF ATTEMPTS FOR LATEST SUBMITTED ASSIGNMENTS AND T&L (CONT.) • Follow-on questions: • Is it positive or negative to have multiple assignment submittals? In some cases, positive; in others, negative. 147
  • 148. 13. ABOUT CONVERSATIONS (IN-SYSTEM EMAILS) 148
  • 149. 149
  • 150. CONVERSATIONS WITH MEDIA OBJECTS INCLUDED AND T&L • In Canvas, “conversations” are internal emails within the system. • It is possible to attach media objects to these emails. • Only 49 in 100,000, or roughly 5 in 10,000, conversations in the Canvas LMS instance contain a media object. 150
  • 151. CONVERSATIONS WITH MEDIA OBJECTS INCLUDED AND T&L (CONT.) • Follow-on questions: • When do media objects (audio, video, multimedia, and others) add learning value in internal emails? 151
  • 152. 152
  • 153. CONVERSATIONS W/ OR WITHOUT ATTACHMENTS AND T&L • 11% of emails in the Canvas instance contain attachments; 89% do not. • These attachments are any sort of attachment, such as a text file or other file, including “media objects” (like audio files, video files, etc.). • Some attachments to “conversations” are assignments to learners from instructors. Some attachments are homework assignments by learners to instructors. Some are notes between learners. There are many other use cases for attachments to conversations in the LMS. 153
  • 154. CONVERSATIONS W/ OR WITHOUT ATTACHMENTS AND T&L (CONT.) • Follow-on questions: • Attachments (articles, slideshows, images, and media objects) can often be highly valuable for learning. How can instructors be encouraged to add these when relevant (without overwhelming learners with too much information)? 154
  • 155. 155
  • 156. ORIGINS OF CONVERSATIONS / MESSAGES AND T&L • The “origins of conversations” may be either human or system-generated. It is possible that auto-generated messaging by email was not turned on, which would explain why there are none. • Follow-on questions: • Are there potential benefits to auto-generated conversations? (There already are such emails to @k-state.edu / @ksu.edu emails, but should auto-generated conversations exist for the LMS instance as well? Or would this just eat up memory and frustrate people?) 156
  • 157. 157
  • 158. CONVERSATION MESSAGES WORD FREQUENCY COUNT AND T&L • What do people in an LMS converse about in the internal email? All sorts of education concerns, of course (as may be seen in the prior word cloud). • Follow-on questions: • What communications do learners handle in-LMS vs. through other means (Like university email systems? Like social media? Like discussion boards? Like web conferencing sessions?) Are the communications set up as efficiently as possible for learning value? 158
  • 159. 159
  • 160. MASS CONVERSATION MESSAGE CONTENTS AND T&L • In general, at the macro level of the LMS instance over the four years of the analysis, the conversations tend to be analytical, power-based, and positive in tone, but the authenticity of the communications tends to be low; higher “authentic” scores indicate “honest, personal, and disclosing” language, while lower scores indicate “a more guarded, distanced form of discourse.” • Follow-on questions: • Is academic speech more analytical, power-based, and positive in sentiment but more guarded? Maybe. Those who want to succeed may have to be this way. 160
  • 161. 161
  • 162. MESSAGING ABOUT “HUMAN DRIVES” IN THE MASS CONVERSATION MESSAGES AND T&L • The human drives in the expressed conversations on the LMS instance showed a tendency toward affiliation, power, and reward. There is less push for achievement and little sense of risk-taking. • Follow-on questions: • Risk-taking by learners is thought to be positive if it enables people to be more confident and active in their learning. Should the communications be more achievement oriented and risk oriented? 162
  • 163. 163
  • 164. SENTIMENT ANALYSIS OF SAMPLE OF CONVERSATION MESSAGING AND T&L • From the sample of conversation messaging, the messages were equal parts “very positive” and “moderately negative.” Very few messages were “very negative.” • It seems healthy that people can be constructively positive and negative. • The next step would be to analyze the body of the messages that were coded to the various categories of sentiment. • Of course, there is a risk of stumbling across sensitive information here in this part of the LMS data portal. 164
  • 165. SENTIMENT ANALYSIS OF SAMPLE OF CONVERSATION MESSAGING AND T&L (CONT.) • Follow-on questions: • There has to be a balance of positive and perhaps slightly negative sentiment in the learning context to help people feel supported and cared for as learners. Instructors who have been teaching a long time have a sense of the right balance. Looking at the emails may be too intrusive…but the general point holds: people are more responsive to positive supports than to negative messaging. 165
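  A toy sketch of how messages might be bucketed into sentiment categories like those named in the slides (“very positive” through “very negative”). The tiny lexicon and score thresholds here are illustrative assumptions only, not the actual tool or coding scheme used in the analysis.

```python
# Illustrative mini-lexicons; a real analysis would use a validated lexicon or model.
POSITIVE = {"great", "thanks", "helpful", "excellent", "good"}
NEGATIVE = {"confusing", "late", "unfair", "wrong", "bad"}

def sentiment_bucket(text):
    """Assign a message to a coarse sentiment category by lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score >= 2:
        return "very positive"
    if score == 1:
        return "moderately positive"
    if score == 0:
        return "neutral"
    if score == -1:
        return "moderately negative"
    return "very negative"

print(sentiment_bucket("thanks this was helpful and good"))  # very positive
print(sentiment_bucket("the grading felt unfair"))           # moderately negative
```

  Running every message through such a function and tallying the categories yields the kind of distribution summarized on the prior slides.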
  • 166. [image: hierarchy chart of auto-extracted themes from the conversation messaging sample] 166
  • 167. AUTO-EXTRACTED THEME-BASED HIERARCHY CHART OF CONVERSATION MESSAGING SAMPLE & T&L • Auto-extracted themes (from machine learning) capture a range of topics and related sub-topics in the email conversations inside the Canvas LMS. • The treemap shows some of these elements. • Follow-on questions: • Would online emails change if people knew how observable such conversations were, should anyone be interested? 167
  • 168. [image: pie chart of auto-extracted themes from the conversation messaging sample] 168
  • 169. AUTO-EXTRACTED THEMES FROM CONVERSATION MESSAGING SAMPLE AND T&L • Per the pie chart on the prior slide, it looks like the conversations are often time-based and study-based. • Follow-on questions: • Some instructors create discussion boards where people may post questions and get answers back in a timely fashion. Having a central location for questions and answers may make intercommunications more efficient. 169
  • 170. [image: word tree showing contexts of “help” in the conversation text set] 170
  • 171. CONTEXTS OF “HELP” USE FROM THE CONVERSATION TEXT SET EXPRESSED IN A WORD TREE AND T&L • It is possible to drill down into a text set to see the contexts in which a common (or even uncommon) term is used. The prior word tree used “help” as the target term. • Follow-on questions: • Are there efficient ways to get help to learners from conversations without the heavy burden of responding to every student ASAP via email? 171
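  The drill-down behind a word tree is essentially a keyword-in-context (KWIC) listing: each occurrence of the target term with a few words of surrounding context. A minimal sketch, with illustrative messages (not actual data):

```python
def kwic(texts, target, window=3):
    """Return (left-context, term, right-context) tuples for each hit."""
    hits = []
    for text in texts:
        words = text.split()
        for i, w in enumerate(words):
            # Strip trailing punctuation before matching the target term.
            if w.lower().strip(".,?!") == target:
                left = " ".join(words[max(0, i - window):i])
                right = " ".join(words[i + 1:i + 1 + window])
                hits.append((left, w, right))
    return hits

messages = ["Can you help me with the quiz?",
            "Thanks for your help on the essay."]
for left, term, right in kwic(messages, "help"):
    print(f"{left} [{term}] {right}")
```

  Grouping the hits by their right-hand contexts is what produces the branching structure of the word-tree visualization.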
  • 172. 14. ABOUT THIRD-PARTY EXTERNAL TOOL ACTIVATIONS 172
  • 174. NUMBERS OF EXTERNAL TOOL ACTIVATIONS AND T&L • With the advent of Learning Tools Interoperability (LTI), many organizations and corporations have built tools that interconnect their respective resources (online software, online services) with LMSes. • These bridging tools are available for use in the Canvas LMS instance. The third-party apps themselves are free, but the online hosted resources they connect to vary in cost. • Some resources are open-source, and others are proprietary. Some are wholly free, and others require subscriptions or payments. 174
  • 175. NUMBERS OF EXTERNAL TOOL ACTIVATIONS AND T&L (CONT.) • Follow-on questions: • What online learning resources would be beneficial to learners? (Khan Academy? TED-Ed? YouTube EDU? Google Maps? Google Docs? GitHub?) 175
  • 176. [image: treemap of named external tool activations] 176
  • 177. NAMED EXTERNAL TOOL ACTIVATIONS AND T&L • The summary treemap shows the various named third-party tool activations in this instance of Canvas. • It is possible to use iframe and embed code to bring in many of the same resources without using the app activations. • Follow-on questions: • Which tools are being used, and how? • Which third-party apps and tools are being discontinued? 177
  • 178.–181. [image slides: alphabetical bar charts of external tool activations by year, 2013–2016]
  • 182. EXTERNAL TOOL ACTIVATIONS IN 2013 / 2014 / 2015 / 2016 AND T&L • What are the external tools being activated, based on the alphabetical bar charts in the prior few slides? What could explain some of the year-over-year differences? • Follow-on questions: • How can instructors be encouraged to experiment with third-party apps that can add functionality and content to their teaching and learning? 182
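  Year-over-year comparisons like those in the bar charts reduce to a grouped count over the activation records. A minimal sketch follows; the "created_at" and "tool_name" field names and the sample rows are hypothetical placeholders for the actual flat-file schema (a real run would read the export with csv.DictReader).

```python
from collections import Counter

def activations_by_year(rows):
    """Count (year, tool_name) pairs from activation records."""
    counts = Counter()
    for row in rows:
        year = row["created_at"][:4]  # ISO timestamps like "2015-09-01T..."
        counts[(year, row["tool_name"])] += 1
    return counts

# Illustrative stand-in records, not actual K-State data.
rows = [{"created_at": "2014-02-01T10:00:00Z", "tool_name": "Piazza"},
        {"created_at": "2015-08-20T09:30:00Z", "tool_name": "Piazza"},
        {"created_at": "2015-08-21T11:00:00Z", "tool_name": "Qualtrics"}]
counts = activations_by_year(rows)
print(counts[("2015", "Piazza")])  # 1
```

  Pivoting the (year, tool) counts into one series per year reproduces the side-by-side annual charts.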
  • 183. 15. ABOUT COURSE USER INTERFACE (UI) NAVIGATION ITEM STATES 183
  • 185. COURSE USER INTERFACE NAVIGATION ITEM STATE AND T&L • The course user interface navigation item state refers to whether an object appears in the left navigation of the course: whether it is “visible” or “hidden.” • At the summary level, it looks like a majority of the available navigational items are showing; however, this may or may not be positive, given that some instructors do not hide navigational items that they do not use (and end up confusing students). • Follow-on questions: • What are ways to encourage instructors to use the LMS to its full functionality but to hide what they do not use (to support teaching and learning without adding to the cognitive load)? 185
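  Summarizing the visible/hidden split across courses is a simple tally over the navigation-item records. A brief sketch, assuming a "state" field; that field name and the sample rows are assumptions about the flat-file schema, not confirmed column names.

```python
from collections import Counter

def nav_state_summary(rows):
    """Return the proportion of navigation items in each state."""
    counts = Counter(row["state"] for row in rows)
    total = sum(counts.values())
    return {state: n / total for state, n in counts.items()}

# Illustrative stand-in records.
rows = [{"state": "visible"}, {"state": "visible"},
        {"state": "visible"}, {"state": "hidden"}]
print(nav_state_summary(rows))  # {'visible': 0.75, 'hidden': 0.25}
```

  Computing this per course (rather than instance-wide) would surface which courses leave unused items visible.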
  • 187. BACKGROUND KNOWLEDGE REQUIRED • To make good use of this LMS data portal data, it helps to know something of the following: • The institution of higher education • What is going on in the front end of the LMS (through admin access) • An understanding of online teaching and learning • An understanding of what happens when a static course goes live with the animating presences of people and actual teaching 187
  • 188. CHALLENGES TO USING THIS LMS DATA PORTAL DATA • So far, it has been • difficult to make the “business case” for value on campus • difficult to query for learning value directly (such as learning sequences per learner in a course) • difficult, in campus political environments, to change instructors’ uses of the LMS (without stepping on others’ decision-making) • But this effort has been an informal one… 188
  • 189. EXTENDING LMS DATA PORTAL DATA FOR TEACHING AND LEARNING • Improving teaching and learning is hard work. There are political implications to this effort on a campus (of course). • Instructors and graduate teaching assistants (GTAs) at the front lines would enhance the data analyses and are the ones who can most effectively apply the data to improved teaching and learning; they need to be “onboarded” for this work. • The current research insights may be more broadly shared to strengthen the application of the data. 189
  • 190. FINDING COMPARABLE EXTERNAL DATA • While the focus is on institutional improvements and support for teaching and learning, it is possible to go “macro” and beyond the walls of the university…to larger contexts. • To that end, it would be beneficial to have… • comparable data from other, similar institutions of higher education (in similar developmental states of LMS rollout), and • some sort of comparable baseline data. 190
  • 191. SOFTWARE USED • The software used for this presentation includes the following: 7-Zip, Gadwin PrintScreen, Microsoft Access, SQL Express, LIWC, NVivo 11 Plus, MS Excel 2016, MS Visio, Adobe Photoshop, MS PowerPoint, and others. 191
  • 192. CONCLUSION AND CONTACT • Dr. Shalin Hai-Jew • iTAC, Kansas State University • 212 Hale Library • 785-532-5262 • shalin@k-state.edu • The data visualizations (by the presenter) come from “Wrangling Big Data in a Small Tech Ecosystem,” formally published in Summer 2017: http://scalar.usc.edu/works/c2c-digital-magazine-spring--summer-2017/wrangling-big-data-in-a-small-tech-ecosystem. 192