I recently had the incredible good fortune to meet Prof. Joseph Stiglitz, the Nobel laureate. The occasion was a talk in which Prof. Stiglitz explained contemporary problems with the Euro to an audience interested in trans-Atlantic economic issues. The great professor had brought several charts and graphs to drive home his points, and his elucidations were lucid yet succinct, vivid yet precise. It was a delightful experience. I am not sure how many hours the lecture and the follow-ups lasted, but never for a moment did I feel the time pass. It amounted to a refresher course on monetary economics.
As I got back to work the next morning, I felt a growing desire to share what I had experienced with others. My eyes glanced over to the HoloLens on my desk. Perhaps there was a way to carry over the experience of receiving a first-hand explanation of an economic concept from the great professor to the folks who did not have the good fortune of meeting him in person. I felt that an unguided replay of an audio and visual recording would not suffice. Annotations added to such recordings would also distract more than clarify the underlying economic postulates and their application. But immersion in a Virtual Reality (VR), amalgamated mode might just do the trick. With VR, the “digital instructor” would be able to “see” what the “student” sees, and guide the experience toward the formation of thoughts that represent understanding of the concept. I feel the content is very much renderable by the device.
This is what I have in mind.
I’d start with the professor’s book, or, to keep it simple for this blog, his presentation, consisting of charts and graphs. Let’s assume that this content is in PDF format and the application simply displays the PDF. I’d use Unity as the application’s “game” engine. In the game loop, we would open the PDF (using iTextSharp) and initialize a Mesh Renderer to display it.

// Open the PDF with iTextSharp and extract the current page's text.
string file = "file:///C:/Users/AbidNasim/Downloads/GS-DiversityApp.pdf";
PdfReader reader = new PdfReader(file);
PdfReaderContentParser parser = new PdfReaderContentParser(reader);
ITextExtractionStrategy strategy = new SimpleTextExtractionStrategy();
string currentText = PdfTextExtractor.GetTextFromPage(reader, page /* initialized to page 1 */, strategy);

// Grab the renderer that will display the page, and the user's gaze.
meshRenderer = this.gameObject.GetComponentInChildren<MeshRenderer>();
var headPosition = Camera.main.transform.position;
var gazeDirection = Camera.main.transform.forward;

We then wait for the user to position the cursor on a specific object:

// hitInfo comes from a gaze raycast: Physics.Raycast(headPosition, gazeDirection, out hitInfo)
this.transform.position = hitInfo.point;

We can translate the cursor point to a PDF region/rectangle from which we can pick out a line or a graph. At this point we can also give the user voice assistance, as well as accept other input (stimuli) such as gestures:
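As a minimal sketch of that cursor-to-region translation, assuming the PDF page is textured onto a unit quad (local X/Y spanning -0.5 to 0.5); `pageQuad`, `pageWidth`, and `pageHeight` are hypothetical names for illustration:

```csharp
// Map the gaze-cursor hit on the page quad back to PDF page coordinates.
// Assumes the page texture fills a unit quad; pageWidth/pageHeight are the
// page's dimensions in PDF user-space units (assumed, for illustration).
Vector3 localHit = pageQuad.transform.InverseTransformPoint(hitInfo.point);
float pdfX = (localHit.x + 0.5f) * pageWidth;
float pdfY = (localHit.y + 0.5f) * pageHeight;  // PDF origin is bottom-left

// A small pick rectangle around the cursor, used to find the line or graph.
var pdfRegion = new iTextSharp.text.Rectangle(
    pdfX - 10, pdfY - 10, pdfX + 10, pdfY + 10);
```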

        // keywords maps spoken phrases to the actions they trigger.
        var keywords = new Dictionary<string, System.Action>();
        keywords.Add("Enter Graph", () =>
        {
            this.BroadcastMessage("OnEnterGraph");
        });

        // Tell the KeywordRecognizer about our keywords.
        keywordRecognizer = new KeywordRecognizer(keywords.Keys.ToArray());

        // Register a callback for the KeywordRecognizer and start recognizing!
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
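Gesture input can be wired up similarly. A hedged sketch, assuming the HoloLens-era GestureRecognizer API from UnityEngine.VR.WSA.Input (the namespace moved to UnityEngine.XR.WSA.Input in later Unity versions), routing an air-tap to the same handler as the voice command:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // UnityEngine.XR.WSA.Input in later Unity versions

// Recognize the air-tap gesture and route it to the same EnterGraph
// handler that the "Enter Graph" voice keyword invokes.
GestureRecognizer gestureRecognizer = new GestureRecognizer();
gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap);
gestureRecognizer.TappedEvent += (source, tapCount, headRay) =>
{
    EnterGraph(source, tapCount, headRay);
};
gestureRecognizer.StartCapturingGestures();
```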

So, finally, whenever a clicked PDF object has a corresponding hologram asset, we can render or play it:

    private void EnterGraph(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        // Spawn a small sphere to mark the selected data point.
        var dataPt = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        dataPt.transform.localScale = new Vector3(0.1f, 0.1f, 0.1f);

        // Place it in front of the user, angled slightly below the forward direction.
        var transformForward = transform.forward;
        transformForward = Quaternion.AngleAxis(-10, transform.right) * transformForward;
        dataPt.transform.position = transform.position + transformForward * 2.0f;

        // Narrate the graph.
        audioSource.clip = graphClip;
        audioSource.Play();
    }
    public void OnEnterGraph()
    {
        EnterGraph(InteractionSourceKind.Voice, 1, new Ray());
    }
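To decide which hologram asset corresponds to a clicked PDF object, one simple approach is a lookup table from region labels to prefabs. A sketch under stated assumptions: `regionLabel`, the label strings, and the `Resources` asset paths are all hypothetical, for illustration only:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical lookup from PDF region labels to hologram prefabs,
// so a tapped chart can be replaced by its 3-D counterpart.
Dictionary<string, GameObject> hologramAssets = new Dictionary<string, GameObject>
{
    { "euro-deficit-chart", Resources.Load<GameObject>("Holograms/EuroDeficitChart") },
    { "gdp-graph",          Resources.Load<GameObject>("Holograms/GdpGraph") }
};

GameObject prefab;
if (hologramAssets.TryGetValue(regionLabel, out prefab))
{
    // Place the hologram slightly in front of the user's gaze.
    var position = Camera.main.transform.position + Camera.main.transform.forward * 1.5f;
    Object.Instantiate(prefab, position, Quaternion.identity);
}
```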

Well, this is just conceptual so far. I will follow up with a proof of concept as soon as I get my Creators Update (April 11 or sooner; I am on the Insider track), and I’ll share the full code for it with everyone. Meanwhile, if you’d like to follow along with the POC, have your SDKs updated and download the HoloLens emulator.

Just to reiterate the concept statement: an educational “game” on HoloLens is qualitatively different from any other form factor or medium in that the game and the user interact at a much more intuitive level. We can therefore present audio-visual information in amalgamated reality in a way that can be extremely effective for learning, and add to that the ability to enrich information with holograms. The sky is the limit!