7 Questions on Anthology's Approach to AI

The transformative power of artificial intelligence has led many ed tech vendors to embrace the technology in their product portfolios, from curriculum-building tools to advanced analytics, student engagement, and more. The possibilities are endless, according to Anthology Chief Product Officer JD White, but they must be explored in conjunction with a commitment to ethical AI standards and a keen eye on the student experience. We spoke with White about his company's vision for AI as well as practical considerations for embracing AI tools in the classroom.

Campus Technology: Anthology recently announced a partnership with Microsoft to incorporate generative AI into the Anthology ecosystem of ed tech solutions. Why choose Azure OpenAI?

JD White: Anthology and Microsoft have a long history of partnering. In fact, it's this history that led to Anthology being the first Education ISV [Independent Software Vendor] invited into Microsoft's OpenAI work. This long-standing partnership, and our shared commitment to responsible AI design, made working together a natural fit.

By integrating Microsoft's AI features into Anthology's purpose-built higher education solutions, we are ensuring our solutions are scalable, secure, and seamlessly integrated into existing educational ecosystems, making them easily accessible to our global client base.

Overall, this partnership will enable our clients to take advantage of more AI-facilitated capabilities within our joint ed tech solutions and unlock new opportunities for personalized student engagement, advanced data analytics, and streamlined administrative processes.
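As a purely illustrative aside, here is a minimal sketch of what a call to an Azure OpenAI chat deployment looks like from Python. The endpoint, deployment name, and prompt are placeholder assumptions, not Anthology's actual integration.

```python
# Minimal sketch: asking an Azure OpenAI chat deployment to draft course content.
# Endpoint, key, deployment name, and prompt are placeholders for illustration only.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o",  # the name of a deployment in your Azure OpenAI resource
    messages=[
        {"role": "system", "content": "You help instructors draft course materials."},
        {"role": "user", "content": "Suggest five learning modules for an introductory statistics course."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```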

CT: What's possible right now with AI-enabled tools — what are some practical ways institutions can make use of them today?

White: One of the uses we are most excited about is the course-builder aid for instructors in Blackboard Learn from Anthology.

The AI Design Assistant makes Blackboard Learn Ultra the first major LMS to extend generative AI capabilities to instructors and instructional designers, helping them build courses, source images, and create tests and rubrics more efficiently than ever.

The result is less time spent on course creation and more time with students. That's really the theme for the AI Design Assistant: inspiration. The tool is there to inspire instructors as they build their courses.

Practical uses of AI in course creation include:

  • Suggesting a possible course structure based on the context and outcomes of the course. The AI Design Assistant generates a scaffolding of learning modules for the instructor or instructional designer to review, enhance, and approve.
  • Identifying royalty-free images based on the context of the course, which makes it easy to create visually engaging courses and reduces copyright concerns.
  • Streamlining the time-consuming task of creating formative questions from the content or context of a particular learning module. The generated questions are meant to inspire the instructor, who can then add input and edit as needed.
  • Simplifying rubric creation within the existing instructor workflow by generating a draft rubric aligned with the specific outcomes for an assignment or discussion.
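To make the first item in that list more concrete, here is a hypothetical sketch of how a course-structure suggestion might be assembled from course context and learning outcomes and held as a reviewable draft. The class and function names are assumptions for illustration, not the AI Design Assistant's implementation.

```python
# Hypothetical sketch: turning course outcomes into a prompt and a reviewable
# module scaffold. Names are illustrative and do not come from Blackboard Learn.
from dataclasses import dataclass

@dataclass
class ModuleSuggestion:
    title: str
    description: str
    approved: bool = False  # stays False until an instructor reviews and approves it

def build_scaffold_prompt(course_title: str, outcomes: list[str]) -> str:
    outcome_lines = "\n".join(f"- {o}" for o in outcomes)
    return (
        f"Course: {course_title}\n"
        f"Learning outcomes:\n{outcome_lines}\n"
        "Propose a sequence of learning modules, one per line, as 'Title: description'."
    )

def parse_scaffold(raw_text: str) -> list[ModuleSuggestion]:
    modules = []
    for line in raw_text.splitlines():
        if ":" in line:
            title, description = line.split(":", 1)
            modules.append(ModuleSuggestion(title.strip(), description.strip()))
    return modules
```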

As part of our commitment to responsible and ethical use of AI, all of these features were developed following Anthology's Trustworthy AI Approach. Humans are always in control. The institution and the instructor can choose whether to enable and use the features based on their institution's policies and preferences.

CT: Is there a learning curve for educators and administrators in understanding how best to use AI tools and how to interpret their output?

White: Interacting with and creating prompts for generative AI is a skill, and most people are still at the very early stages of developing that skill. The beauty of a tool like Anthology's AI Design Assistant is that it helps instructors create more engaging learning experiences more quickly without requiring those skills. The capabilities are seamlessly integrated into existing workflows and are available in the context where they are needed.

For example, an instructor may want to create a grading rubric for an assignment. Instructors can continue to create their own rubrics from scratch, but with the AI Design Assistant they now get an option at the point of creation to have Blackboard Learn suggest a grading rubric based on the context of the course and the assignment. The instructor can then review, edit, and decide whether to accept the suggested rubric.

This pattern of easy-to-use contextual suggestions is applied consistently across all the AI Design Assistant capabilities, which keeps the experience intuitive and the learning curve small.
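As a rough illustration of that suggest-review-accept pattern, the sketch below models an AI-drafted rubric that stays in a draft state until an instructor explicitly accepts, edits, or discards it. The data structures and review function are assumptions for illustration, not Blackboard Learn code.

```python
# Hypothetical sketch of a human-in-the-loop review step for an AI-drafted rubric.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RubricCriterion:
    name: str
    levels: dict[str, str]  # e.g. {"Exemplary": "...", "Developing": "..."}

@dataclass
class RubricSuggestion:
    criteria: list[RubricCriterion]
    status: str = "draft"  # "draft" until the instructor moves it to "accepted" or "discarded"

def review(suggestion: RubricSuggestion, accept: bool,
           edits: Optional[list[RubricCriterion]] = None) -> RubricSuggestion:
    # The instructor makes the call; an AI draft never goes live on its own.
    if edits is not None:
        suggestion.criteria = edits
    suggestion.status = "accepted" if accept else "discarded"
    return suggestion

# Example: accept the draft after swapping in an instructor-edited criterion.
draft = RubricSuggestion(criteria=[
    RubricCriterion("Thesis clarity", {"Exemplary": "Clear and well supported",
                                       "Developing": "Present but underdeveloped"}),
])
final = review(draft, accept=True)
```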

CT: How do you see AI tools impacting the student experience?

White: It's important to take a careful and thoughtful approach to implementing AI tools across any aspect of the educational lifecycle, but especially as it relates to the student experience. Our initial focus for AI-related tools is centered on inspiring instructors, empowering advisers, and supporting institutional leadership by providing meaningful efficiencies that allow them to deliver more engaging student experiences with human oversight.

AI technology is still in its infancy. We cannot forget about AI risk factors, such as the potential for bias and inaccuracies. This is why we developed our Trustworthy AI Framework and believe that it's critically important that instructors and faculty are always part of the review and oversight process.

There has been a lot of focus on the inclusion of AI detection in learning management solutions. In our opinion, detection is not yet reliable enough to be a viable approach. AI detection can provide a false sense of security, and it can disadvantage students with disabilities and those who are learning in a language other than their native one. Instead, an emphasis should be placed on embracing authentic assessment. With AI able to distill existing information with such efficacy, assessment needs to focus on critical thinking, personal perspectives, and self-reflection rather than solely on the accrual of knowledge.

AI has the potential to further reimagine the student experience with improved personalization and empowerment. Institutions can design programs that adapt more efficiently to skill development and employer needs, remaining responsive to both the learner and the community.

CT: What does that Trustworthy AI Framework consist of?

White: Anthology has been doing policy work on the ethical use of AI and data since 2018. And, with recent acceleration in generative AI, we released our Trustworthy AI Program and Policy to ensure that approach is instilled in all of our initiatives. The Trustworthy AI Framework has been developed in collaboration with clients, and defines several principles that we adhere to:

  1. The institution, the instructor, and the instructional designer are always in control. They need to explicitly opt into these capabilities.
  2. Any suggestions made by the system need to be explicitly reviewed and accepted by the instructor or instructional designer.
  3. We address copyright and data privacy concerns. No institutional data is used for training models.
  4. AI comes with risks such as the potential for bias and hallucinations. We fervently believe that a human always needs to be in the middle to review and accept. AI tools should not be used directly by students without any human intervention or in high-stakes workflows such as grading.
  5. We provide an institutional audit trail and reporting to allow the institution to easily see where AI suggestions have been incorporated.
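As one concrete reading of principle 5, here is a minimal sketch of an append-only audit record capturing where an AI suggestion was used, who reviewed it, and whether it was accepted. The field names and log format are assumptions for illustration, not Anthology's schema.

```python
# Hypothetical sketch of an audit trail entry for an AI suggestion.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class AISuggestionAuditRecord:
    course_id: str
    feature: str    # e.g. "course_structure", "rubric", "question_bank"
    reviewer: str   # the instructor or designer who made the final call
    accepted: bool
    reviewed_at: str

def log_suggestion(record: AISuggestionAuditRecord, path: str = "ai_audit_log.jsonl") -> None:
    # Append-only JSON Lines log that institutional reporting could read from.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

log_suggestion(AISuggestionAuditRecord(
    course_id="STAT-101",
    feature="rubric",
    reviewer="instructor@example.edu",
    accepted=True,
    reviewed_at=datetime.now(timezone.utc).isoformat(),
))
```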

CT: What is Anthology's vision for what will be possible with AI in the future?

White: Anthology's ongoing vision is to unify data across the silos of ed tech systems to create data-driven insights that provide a more personalized and guided experience and improve the learner journey. It's a vision we call Anthology Intelligent Experiences (iX). The result is faculty and staff better positioned with just-in-time information to promote student success. iX combines data across multiple solutions to help learners align their courses with the skills needed to compete for their desired career and provide them with the badging to showcase mastery of skills to a future employer.

When these intelligent experiences and learning opportunities are infused with AI, instructors and advisers can spend more one-on-one time with learners, identifying and scaling interventions to students at the time of need. Administrative leaders can query data more effectively by building new models to guide institutions' strategic directions and improve student outcomes, and the learner is empowered to make decisions to author their own education and future. It has the potential to place the learner at the center of their learning experience.

CT: Where do you see opportunities for ed tech innovation moving forward?

White: Institutions have a wealth of data, yet they often lack the tools and skills to take advantage of it. We envision ed tech solutions that embrace the siloed data instances, align the necessary elements of that data as it relates to a specific student or specific job function, and deliver in-product insights that are personalized and guided to the task at hand. This approach is a huge opportunity that can exponentially benefit institutions around the world. Students, faculty, and staff need the insights that the data can provide, but they don't always have the skill sets to build the necessary reports or even know exactly what elements of the data exist within the ed tech solution they are using. The potential is limitless as the solutions can become more intelligent and help guide the user based on personalized needs and goals.

We often hear about the lack of alignment between education training and the real-world skills required to make a new student hire effective in their job from day one. Today's learners are expected to enter the workforce and hit the ground running, but there is a disparity between the skills they gain in the classroom and the skills employers need in the workforce. Ed tech innovation has an opportunity to help institutions pivot with the pace of workforce demand in real time, aligning those ever-changing skills requirements to the courses being provided. This might also include showcasing to institutional leaders areas where they need to expand course material or subject matter to better align their resources with the latest job market requirements. Imagine a world where all the new jobs created by AI that don't yet exist can be identified on a moment's notice, and your institution can adjust course to align just as quickly. Not only can instruction and workforce skills become better aligned, but there is potential to establish industry-wide alignment on the certifications or badging that showcase mastery of a skill across institutions. This type of centralized skill recognition could ensure that an employer, as well as a student, is on the right track from school to workforce.

Ed tech innovation is on the cusp of major change. Those who embrace innovation will be able to offer new ways to accelerate institutional efficiency, promote retention and student success, and deliver tools that instill a lifelong passion for learning.
