On Sunday, July 5, we went for a hike in one of my favorite places on this planet (Stone Mountain State Park) on a section of what is becoming one of my favorite trails, the MST. One of the many reasons I love Stone Mountain is that it is such a photogenic rock:
But on this day, we skirted around the rock and headed for the base of the escarpment, just past Widows Creek.
I’ve used Strava for quite some time to track my bicycling efforts, and recently I’ve discovered that it is also pretty good for tracking hikes. So, I thought I’d track my Sunday “stroll” up the escarpment. Results below, and here.

With my father, I had done the hike once before, one way in reverse — from the Blue Ridge Parkway down to the Stone Mountain backpackers’ parking lot. On the 5th, I wanted to go up and back, hoping to turn around at the ruins of an old mountain shack.
As usual, we made it up the STEEP grade (as shown above) at a quick pace — I definitely prefer climbing to descending. I had thought the shack was at a lower elevation, so throughout the hike I kept thinking I had somehow missed it in the heavy vegetation (my previous hike had been during winter).
Lo and behold, we finally made it to the shack. I checked the Strava app, and we were roughly 4.8 miles from where we started.
I had purposely avoided checking the app until then, partly because I wanted to ignore the digital and focus on the analog world around me. I knew, however, that I had been pushing my own pace because I had the tracker going.
Quickly, I put my phone back in my pocket.
I briefly considered finishing the relatively short distance to the Blue Ridge Parkway, which would have given us a 12-mile day. Then the drizzle started, and it seemed best to begin heading downhill as planned.
Not too long after beginning the return, the rain increased and the thunder rolled. Not too much lightning, but enough to pick up the pace in order to lose elevation as quickly as possible.
Long story short: We descended about twice as fast as I would have liked. Halfway back to the bottom, there was not a dry fiber of clothing on our bodies (rain gear would not have made a difference). I was drenched to the point that I began thinking about the fine print of my phone insurance plan.
And this, of course, brought the technological perspective of my experience to the forefront of my mind as I continued the descent. By then, I had realized that the safest path down was directly in the channel of rainwater gushing down what was looking less like a trail and more like a waterslide. Everywhere else I stepped was a mudslide. At least under the rushing water there was usually relatively solid rock. Splish, splash, sploosh.
Being the designer I am — with a relatively constant third-person perspective of my own experiences — I can’t help but think about what this current path selection activity means in terms of a demonstrated performance of learning: knowledge, skills, abilities, the works. This gets me thinking about a few of my favorite things: learning, assessment, hybrid digital experiences.
I couldn’t wait to see my results on Strava. I wanted feedback, beautiful data visualizations, luscious maps, the whole nine yards (or ten miles, actually) — but I wanted to make sure I didn’t lose the activity, because I have no cell service out at Stone Mountain (it’s happened before).
Wait, that’s one of the other reasons why I love Stone Mountain so much – my phone doesn’t work out there!
Out here, emails don’t push vibrations into my pockets. I can actually enjoy my natural surroundings. I can enjoy slipping in the mud and appreciate the marginal core strength that keeps me from blowing out a knee as I ski down a few feet of the descent on one soaked trail shoe.
So, really, where does unobtrusive assessment fit into this obviously rich learning experience – this amazing opportunity to demonstrate my knowledge, skills, and abilities in so many ways? Beyond the Strava app, which we’ve already established is giving me mild test anxiety (in the form of intentional pace increases), is there any other quasi-unobtrusive measurement/assessment possible?
Silent black helicopters, maybe?
Considering the design of a hybrid learning digital (un)obtrusive assessment task delivery and measurement ecosystem supporting the types of learning experiences happening within and around two soaked hikers on a steep-ass trail in the remote corners of a state park: What are some of the xAPI statements that could be generated about this learning activity?
Let’s keep it simple: actor, verb, object, result, context, attachment(s).
What, actually, were the learning events taking place? Off the top of my head, I can think of a few: I was exploring the use of the Strava app for tracking hikes, taking another person to see a location/structure I had seen before, and interpreting the remains of this structure to estimate its age and purpose.
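To make that first event concrete, here is a minimal sketch (in Python, emitting JSON) of what one such statement might look like. The verb IRI comes from the standard ADL verb vocabulary; the actor’s mbox, the activity IRI, the extension keys, and the result values are all hypothetical placeholders, not anything Strava or an LRS actually produces.

```python
import json

# Hypothetical xAPI statement for "exploring the Strava app for
# tracking hikes." Only the verb IRI is from the ADL vocabulary;
# every other identifier and value below is an illustrative placeholder.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Hiker",
        "mbox": "mailto:hiker@example.com",  # hypothetical
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/activities/mst-escarpment-hike",  # hypothetical IRI
        "definition": {
            "name": {"en-US": "MST escarpment out-and-back, Stone Mountain State Park"}
        },
    },
    "result": {
        "completion": True,
        "duration": "PT4H30M",  # ISO 8601 duration; illustrative value
        "extensions": {
            # roughly 4.8 miles each way, per the Strava reading at the shack
            "https://example.com/xapi/distance-miles": 9.6
        },
    },
    "context": {
        "platform": "Strava",
        "extensions": {
            "https://example.com/xapi/weather": "thunderstorm"  # hypothetical
        },
    },
}

print(json.dumps(statement, indent=2))
```

An LRS would accept something shaped like this via its statements endpoint; the result and context extensions are where app-generated data such as distance and weather could live, under IRIs the designer controls.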
What about my ability to adapt to extreme weather conditions when hiking (without major personal injury or destruction of property), or my attitudinal perseverance through tough conditions, or my awareness of potential risks (such as hypothermia) and preparedness to deal with the onset of these potential outcomes? I made a concerted effort to stop and vocalize the fact that we needed to change into dry clothes once we got back to the car, and I also mentioned a technique of quizzing each other with simple math problems in order to keep tabs on our (lack of) cognitive function as an indicator of hypothermia onset.
Also, what about demonstrations of physical fitness, ecological awareness or sense of place, and knowledge of species and interspecific relationships? And let’s not forget the need to differentiate between individual and collaborative demonstrations of learning.
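On that last point, xAPI does give us a lever: an actor may be a single Agent or a Group with a "member" list, so the same statement can attribute a demonstration of learning to one hiker or to the pair. A sketch, with all names and mbox values as hypothetical placeholders (the verb IRI is again from the ADL vocabulary):

```python
# Individual attribution: one hiker as a single Agent.
individual_actor = {
    "objectType": "Agent",
    "name": "Hiker One",
    "mbox": "mailto:hiker1@example.com",  # hypothetical
}

# Collaborative attribution: both hikers as a Group with a member list.
collaborative_actor = {
    "objectType": "Group",
    "name": "Hiking pair",
    "member": [
        {"objectType": "Agent", "name": "Hiker One", "mbox": "mailto:hiker1@example.com"},
        {"objectType": "Agent", "name": "Hiker Two", "mbox": "mailto:hiker2@example.com"},
    ],
}

def as_statement(actor):
    """Wrap either actor in a minimal statement about the storm descent."""
    return {
        "actor": actor,
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/responded",  # ADL verb vocabulary
            "display": {"en-US": "responded"},
        },
        "object": {
            "objectType": "Activity",
            "id": "https://example.com/activities/storm-descent",  # hypothetical IRI
        },
    }

print(as_statement(collaborative_actor)["actor"]["objectType"])  # prints Group
```

Swapping the actor is all it takes to move between individual and collaborative framings of the same activity — the verb, object, result, and context can stay put.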
So, in conclusion, considering the actor, verb, object, result, context, attachment(s) properties of an xAPI statement, what might the xAPI statements for these learning activities look like? Also, who (or what) could create them, and when and where could they be stored?
Most importantly, why are they valuable as xAPI statements?