LDT Seminar – Final – Fall Quarter Reflection

Assignment

  1. Evaluate your own contributions to seminar based on this rubric.
  2. Explore your own learning inside and outside of class in a brief reflection paper (1-2 pages).  

Attendance*

  Below Expectations: Misses two or more seminars. Comes late or leaves early. Does not inform instructor of absence in advance.
  Meets Expectations: Attends all of seminar, or misses one with a very good excuse (e-mailed to instructor ahead of time). Always on time.
  Exceeds Expectations: Organizes extra learning opportunities for other learners.

Assignments**

  Below Expectations: Assignments are late, incomplete, or poorly executed.
  Meets Expectations: Assignments are turned in on time. All outside work is turned in on time (or ahead of time). Assignments address the assignment components, but appear rushed or have errors.
  Exceeds Expectations: In-class and out-of-class assignments are completed thoughtfully and thoroughly. In out-of-class work, attention is paid to content, spelling, grammar, and flow.

Participation

  Below Expectations: Rarely speaks, or rarely listens. Carries on side conversations or other off-topic activities (for example on the computer).
  Meets Expectations: Mostly listens, but speaks sometimes. Or mostly speaks, but listens sometimes.
  Exceeds Expectations: Speaks and listens actively in class. Builds on the ideas of others. Challenges own thinking and that of others. Seeks to make connections between concepts in class and to outside experiences.

Response

Self-evaluation:

Attendance

Meets Expectations – Attends all of seminar, or misses one, with very good excuse (e-mailed to instructor ahead of time). Always on time.

I did not evaluate myself as “Exceeds Expectations – Organizes extra learning opportunities for other learners” because I did not have the time, or put in the effort, to help my peers. Had I been aware of this evaluation at the beginning of the quarter, I might have attempted to do so. I did have the opportunity to teach Reuben some iOS development – which he expressed interest in – but I simply did not have the time. I expect that next quarter I might not have the time either, but I will make an effort to be more aware of these teaching/knowledge-sharing opportunities. On the other hand, I maintained a blog (www.lucaslongo.com) where I recorded all of my reading and class notes, as well as all the assignments I created during the quarter. Even though the blog’s main purpose was to document my work and serve as a repository for future consultation, I believe my peers could potentially benefit from these notes.

Assignments

Exceeds Expectations – In-class and out-of-class assignments are completed thoughtfully and thoroughly. In out-of-class work, attention is paid to content, spelling, grammar, and flow.

With the exception of three of the four final papers for Tech 4 Learners, all my assignments were turned in on time and thoughtfully created. I must say that I have never studied so attentively or engaged so deeply with content in an educational environment.

Participation

Exceeds Expectations – Speaks and listens actively in class. Builds on the ideas of others. Challenges own thinking and that of others. Seeks to make connections between concepts in class and to outside experiences.

My participation in class is a way in which I learn. Exposing my ideas and thoughts in class helps me validate my understanding of the content. I see it as a technique to engage with the content and to stimulate myself to challenge what I already know. I honestly cannot remember a class in which I did not make a contribution.

Explore your own learning inside and outside of class in a brief reflection paper (1-2 pages):

These past 10 weeks of classes have been the most intense learning experience I’ve ever had. The quarter system provides a sense of urgency and speed in absorbing the material that a semester system does not. There is no opportunity to catch up – if you let the ball drop, it seems impossible to recover. The fact that I was taking 18 units distributed amongst 7 classes also contributed to this feeling of a massive knowledge dump into my brain. Yet I feel that the teaching quality and pedagogical level of the course delivery were key in making this all possible. I surmounted the task and feel like I am definitely more knowledgeable after this quarter.

One of the main reasons why I chose to come to Stanford and pursue the LDT Master’s program was to understand how education really works, how we learn, how to teach, and what one must consider when diving into the complex task of education. This quarter showed me that I came to the right place. It also showed me that education is much more complex than I previously knew. My respect for K-12 teachers grew exponentially as the quarter went by, along with my amazement and incredulity that the profession is not valued as the most challenging of them all. How can such a vital role in our society be so undervalued? Why is teaching, which has the most profound effect on our children’s future, one of the lowest-paid professions around? Education is the one thing that no one can take away from us.

Reflecting upon each course I took, I can say that each minute spent in the classroom, each line of text read, each group discussion, and each assignment completed added to what I desired to learn and to the way I see the world. Let’s go through each course to illustrate the main takeaways:

Topics in Brazilian Education

Even though I was born and raised in Brazil, I never attended the Brazilian educational system. When I was 3 years old I started attending the American school in São Paulo. When I was 12, I moved to Italy, where I attended the British school in Milan. Back in São Paulo a year later, I continued on to the British school in São Paulo. My undergraduate studies were at Rensselaer in Troy, NY; graduate school at NYU; and now Stanford.

This created a vacuum in my knowledge of the Brazilian educational system and its history. This class was an eye-opener in terms of what has happened in recent Brazilian history and what still needs to be done. Even though the course was the least organized of all my courses this quarter, it showed me that public education in Brazil is an afterthought for the government. Huge investments were made in higher education, but K-12 was marginalized. The feeling is that kids go to school to get fed, and so that their parents can go to work and receive financial aid from the government.

I now understand why the Lemann Fellowship exists. Its stated mission is to improve Brazilian public education by providing funds to those who get into the top schools in the world. It previously seemed like an altruistic move, but it is more of a real and pressing necessity for Brazil.

Introduction to Teaching

This course presented me with the formal techniques and considerations teachers must attend to in their profession. I was amazed at how complex teaching really is – especially at the K-12 level, where teachers must not only have PCK, but must also differentiate between students’ cognitive levels and cultural backgrounds, manage classroom behavior and dynamics, and perform formative assessment continuously – all at the same time. Teachers are my new heroes. Reading Lampert’s “Teaching Problems and the Problems of Teaching” shows how complex teaching simple math division can be – down to the choice of which number to present in an exercise. It requires planning, constant evaluation, and thinking on your feet to ensure the learning objectives are met.

The wealth of terminology learned in this class was also extremely helpful. I knew nothing about teaching before this class. I must confess I had either never heard of or did not know the full meaning of the terms we covered in class: didactic/direct instruction, facilitation, coaching, ZPD, transfer, metacognition, prior knowledge, scaffolding, APA Style, Bloom’s Taxonomy, modeling, guided practice, PCK, differentiation, formative assessment, summative assessment, the black box, teach for the test, learning progression, rubric, formal and informal learning environments, funds of knowledge, and teacher professional development. Wow… I can’t believe how much I’ve learned from this one class. Truly amazing. 

Tech 4 Learners

The main takeaway from this class was the danger of the technocentric view of education – which I must admit I suffered from. I came to LDT with a notion that I would be able to get all of my school’s content, put it online, and only need the teacher once I had to update the course content. This course showed me that a human teacher and human peers interacting in real life are essential for effective learning to take place. I definitely now see that MOOCs by themselves are not the way to go – there must be a component of human interaction, of peer communication, and of timely commitment towards a final learning objective. 

In parallel, this class gave me the opportunity to work once again with children with special needs. While at ITP, I took a course called “Inclusive Game Design” where we created a game for a child with cerebral palsy. It was one of the most rewarding experiences I’ve had in designing a tool. To see the child interact with the game in the way we intended was simply breathtaking. This was repeated in this class where our rapid prototypes were able to evolve and adapt towards our goal of helping our learner. 

Terminology and concepts acquired from this class: backwards design, technocentrism, growth mindset and the perils of praise, four-phase model of interest development, joint media engagement, the protégé effect, and tangible user interfaces. 

Understanding Learning Environments

This course provided me with the foundations of learning theory and cognitive development, along with the main theorists of our times. The most interesting concept for me was Lave & Wenger’s Legitimate Peripheral Participation and the notion that learning is what happens in the interaction of masters, apprentices, their actions, and the environment/context in which they are situated. It was interesting to see how much education is based on psychology, philosophy, and cognitive development – something that now seems obvious to me, but that I would have asked for elaboration on in the past. Having read even a small amount of Skinner, Piaget, Montessori, Vygotsky, Dewey, Freire, and several others gave me the confidence to talk about education in a more meaningful manner.

Introduction to Qualitative Research Methods

This was yet another course that presented me with completely new knowledge. Being an engineer and working with software for most of my life, research was never part of my work – not to mention qualitative research. My initial reaction to this course was, “Wow, I can get a job that entails observing the real world in extreme detail and then writing about it in the most interesting manner possible!?” I was thrilled to learn that this kind of research even existed. It gave me a framework for looking at the world, understanding bias, creating interview questions, capturing data, analyzing it, and presenting it. It made me think about writing effectively based on evidence, creating propositions, elaborating theories, and extracting meaning.

Key concepts: I as a camera, turtles all the way down, grounded theory, probing, coding, propositions and validity. 

LDT Seminar

This course made me reflect primarily on the reasons why I came to LDT. What is the problem I want to look at? Is it a real problem? Does it matter? How do I define the problem? It made me understand the importance of reading research, how to research and follow the reference sections for even further reading. It made me talk to experts and learners to understand what has already been done and what still needs to be done. 

It also made me appreciate my diverse and profoundly interesting cohort. How much everyone brings to the table. It left me wanting to know them better and more intensely. It showed me that we can’t always do it ourselves and that collaborating can generate something that is invariably greater than the sum of the parts. It made me think about my role in society and in the immediate community that I am living in. 

Human-Computer Interaction Seminar

This was the course that had the least impact of all this quarter, simply because of its lecture format with no group discussion or interaction – only quick Q&A sessions at the end of each session. The quality of the lecturers and the content presented was amazing though. The most memorable ones were: 

  • Wendy Ju: Transforming Design: Interaction with Robots and Cars
  • Janet Vertesi: Seeing Like a Rover: Visualization, embodiment, and teamwork on the Mars Exploration Rover mission
  • Sean Follmer: Designing Material Interfaces: Redefining Interaction through Programmable Materials and Tactile Displays

All in all, this was an intense quarter which presented me with a wealth of knowledge I had never experienced before. I am extremely pleased with my decision to come to LDT and am eager for the next 3 quarters. I always say that I could stay in school forever. Somehow I feel that it is up to me to find a way to do so – maybe not by getting a third Master’s degree or diving into a PhD (for now), but by getting involved in a company, research group, or organization where my thirst for learning is continuously fed.

Tech 4 Learners – Final – Learning Tool Evaluation

Assignment

Choose any digital learning tool currently on the market.  Explore it, poke at it, twist it and see if you can break it (in a pedagogical sense, not a technical one).  When you have a good sense of what it does, write a description of the tool, including the intended learners, content, and approach to learning.  What are its strengths and weaknesses?  How should it be evaluated? How could it be improved or extended?  2-3 pages

Response

Synopsis

Udemy is an online course marketplace whose mission is to “help anyone learn anything,” according to their website, which also states that every course is “available on-demand, so students can learn at their own pace, on their own time, and on any device.” The platform caters to learners and businesses, offering over 35,000 courses ranging from photography to mobile development. At its core, Udemy offers an online course publication tool that allows instructors to create courses and put them up for sale both on Udemy’s marketplace and on the instructor’s own website. The instructor sets the selling price and shares the revenue with Udemy at varying rates, depending on who initiated the sale. The instructor keeps 97% of the revenue if the sale originated from their own website and 50% if the sale originated from Udemy’s website.
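The two-tier revenue split described above can be sketched as a small function (a hypothetical illustration: the percentages are the ones quoted above, but the function and origin names are my own):

```python
def instructor_payout(price: float, sale_origin: str) -> float:
    """Instructor's share of a sale under the split described above:
    97% when the sale comes from the instructor's own site,
    50% when it comes from Udemy's marketplace."""
    shares = {"instructor_site": 0.97, "udemy_marketplace": 0.50}
    return round(price * shares[sale_origin], 2)

print(instructor_payout(100.00, "instructor_site"))    # 97.0
print(instructor_payout(100.00, "udemy_marketplace"))  # 50.0
```

The same course sale is thus worth nearly twice as much to the instructor when they drive the traffic themselves, which explains why Udemy pushes instructors so hard on promotion.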

Learning

Besides the obvious focus on students, Udemy places significant focus on the instructor, offering several resources to aid in course creation. To start, Udemy offers a free “How to Create Your Udemy Course” course, delivered through the platform itself. A support website is also available, offering articles such as “Getting Started: How do I create my Udemy Course?”. A closed Facebook group is also available for instructors to share experiences, get help, and learn from each other. These resources focus on planning, producing, publishing, and promoting the instructor’s courses.

The designers seem to believe that instructors, as learners of the tool, need to understand how teaching online differs from teaching in a classroom. The support material focuses on guiding instructors through the best practices, media quality, and pedagogical styles that work best in this environment. On top of these resources, Udemy enforces a review process once the course is ready. This process entails a detailed inspection of the quality of the media, the organization of the course content, and the frequency of different media utilized. For example, a course with only text, only slides, or only videos will be rejected. A mix of media, quizzes, and presentation styles is therefore valued by Udemy as essential for the learners (students) to succeed.

Content

Judging by the content presented, the designers see course planning and digital literacy as the main barriers to developing an understanding of the subject matter. Starting with guidance on learning objectives and general course planning, the designers offer basic pedagogical knowledge. Moving on to the production of the course, they offer detailed instructions and specifications on audio and video size and quality, as well as filming and editing tips. Publishing instructions are also offered, guiding instructors on pricing strategies, free course previews, and other ways to make the course more attractive to students. Finally, Udemy provides suggestions on how to drive sales of the course.

Following the content on creating the course, Udemy continues with guides on how to utilize the tools offered on its digital platform. A strict “Course Quality Checklist” is presented, as well as the “Udemy Studio Code of Conduct,” which details what is allowed and encouraged as well as what is frowned upon. Interestingly enough, the last section in the “Udemy Teach” area of their site includes “Coding Exercises,” which covers how to create exercises, validate and check students’ code, and a few example exercises for JavaScript, HTML, and CSS. This shows a tendency for online courses to be heavily geared towards programming. My personal guess as to why this happens: programming instructors and students have higher digital literacy and comfort around technology. It is probably harder to find a tech-savvy yoga instructor who publishes an online course than a yoga student looking for a strictly online course on Udemy. A quick search shows 155 yoga courses versus 557 ‘programming’ courses along with 683 ‘development’ courses. Times are changing.

Technology

The features the designers are leveraging in this implementation revolve around cloud storage and Ajax. Cloud storage means that all the content – including videos – is uploaded to Udemy’s platform and stored in their environment at no extra charge to the instructor. This gives Udemy complete control over the content and delivery quality of the courses. Ajax is a technique for building web pages that allows dynamic loading of content, draggable elements, and the addition of new sections without reloading the page. This provides a fluid and intuitive interface that makes the job of creating course content actually pleasurable.

On the student’s end, the interface is also intuitive, clean, and easy to use. Each section of the course is presented without distractions and provides clear actionable items to control the playback.

Assessment

The success of this tool is publicized on their web page with numbers such as 9 million students, 35 thousand courses, 19 thousand instructors, 35 million course enrollments, 8 million minutes of video content, and 80 languages. Although these are all big numbers by any standard, I would also be interested in looking at the following numbers:

    1. Growth rate of the number of instructors joining the platform
    2. Time between account creation and course publication
    3. Average number of courses published per instructors
    4. Average revenue per course
    5. Course completion rates by students
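As a back-of-the-envelope check, the headline numbers above already let us derive rough versions of some of these ratios (illustrative only: these are marketing figures, not audited data, and the variable names are my own):

```python
# Headline figures from Udemy's marketing page (quoted above).
students = 9_000_000
courses = 35_000
instructors = 19_000
enrollments = 35_000_000

# Derived ratios -- rough proxies for per-instructor output and
# per-student engagement on the platform.
courses_per_instructor = courses / instructors
enrollments_per_student = enrollments / students

print(f"~{courses_per_instructor:.1f} courses per instructor")
print(f"~{enrollments_per_student:.1f} enrollments per student")
```

Roughly 1.8 courses per instructor and 3.9 enrollments per student suggest most instructors publish one or two courses, which is why growth rate and time-to-publication would be the more revealing metrics.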

I would also be very interested in interviewing instructors who have published courses on other platforms to understand what Udemy’s course publication tool is doing right or wrong. From personal trial and error, I have found Udemy’s interface the easiest to use and the one that provides the most scaffolds for the instructor. Their review process is also extremely helpful, with attention to fine details showing that actual humans review the course content. This ensures course quality for students and gives the company a high level of credibility, as well as showing their care towards the learner.

Evaluation

Scale: 0 – Absent, 1 – Minimal, 2 – Strong, 3 – Exemplary

The tool is making effective use of unique features of this technology.

2 – Strong: Udemy uses modern HTML techniques to provide a good user experience. I would have judged it exemplary if there were a drawing tool embedded in the platform – something like a whiteboard that would record my strokes and voice-over from within the tool.

The features of the tool demonstrate an understanding of the target learner.

3 – Exemplary: Udemy’s course publication tool is set up to ask the instructor for information in a structured and familiar manner, using terminology commonly used by teachers – such as course goals and course summary – along with other features one would expect in a pedagogical tool.

The design of the tool suggests an understanding of the challenges unique to learning the target content.

3 – Exemplary: Udemy’s wide variety of content, tools, and possible interactions amongst instructors shows great care towards the main driver in education – the instructor. They understand that teachers, educators, and subject matter experts may not have all the TPACK necessary to become online instructors. To supplement this, they provide content in various formats with several examples and support for each.

Tech 4 Learners – Final – Pedagogical Compass

Assignment: 

A compass is a tool that allows us to orient to different directions, charting a course toward whatever destination we have chosen. For this section of the final, you will create a tool to organize the different concepts covered in this class.  In addition, it should allow us to “map” different EdTech products in some way as well as to guide those who use it toward effective learning experiences.
This is a very open-ended assignment, which can be creatively interpreted. Use it to push your thinking about how these concepts fit together! It can be visual, or even physical, if you so choose. It can take the form of a concept map or a chart or a poem, or any number of other forms. We expect it to fit into 1-2 pages. However you choose to present your thinking, it is important to clearly convey what is particularly important to you. In this assignment you are articulating your position on the pedagogies conveyed by the readings and the course concepts.

It is important that you:

  1. provide some explanation to the user of the compass as to what it means,
  2. reflect on the relationships between the different concepts,
  3. provide references (citations) to scholarship, so that users of the compass can pursue further enlightenment,
  4. articulate which way is “North” to you, and why?

Response: 

(Pedagogical Compass image: Screen Shot 2015-12-14 at 8.01.29 PM.png)

Tech 4 Learners – Final – Advice to a Future Learning Tool Designer

“Activities, tasks, functions, and understandings do not exist in isolation; they are part of broader systems of relations in which they have meaning. These systems of relations arise out of and are reproduced and developed within social communities, which are in part systems of relations among persons. The person is defined by as well as defines these relations. Learning thus implies becoming a different person with respect to the possibilities enabled by these systems of relations. To ignore this aspect of learning is to overlook the fact that learning involves the construction of identities.” (Lave and Wenger, 1991, Ch. 2)

Regardless of your current profession and experience, you have been impacted by education and technology. As our society progresses, we must think about how we might deliver the best educational content, implement the most effective teaching methodologies (pedagogy), and utilize tools that engage both learners and educators in meaningful learning experiences. Education is one of the most complex issues in our society and has been since the beginning of civilization. Without education, how does a community, a company, a country, or the human race progress? This paper, along with the Pedagogical Compass (https://prezi.com/zgdhgwrlealw/), will present an overall view of who the stakeholders are, how education happens for educators, how learning happens, and what we might select as relevant content for the future.

Even if you are not directly involved in education, you have certainly faced the need to teach someone – to explain how something works, train a new employee, or present your research, your work, or your thoughts. With this in mind, we propose to look at educational tools through a set of lenses that might provide an encompassing view when designing effective learning tools. The Pedagogical Compass looks at what we teach (North), how we teach (South), how we learn (East), and who we learn from (West). Through these four cardinal points we hope to stimulate your thinking, drawing on current research, learning theories, and experiments done in the field.

If we look at user experience designers, we generally consider a tool’s graphical layout, the affordances it provides, its usability or ease of use, and finally the service and/or outcome the tool offers. Game designers go a step further, looking at how the user repeatedly engages with the tool, reward systems, and how the gamer learns and progresses through the gameplay. One effective framework is the “Core Loop,” which looks at every step of engagement within a game. It involves a cycle that starts with 1) assessing the current scenario, 2) choosing the correct action, 3) aiming your action appropriately, 4) launching your action, and 5) being rewarded (or not) by the consequences of your action. Once rewarded, you go back to step 1, where you assess your next move. By identifying the elements in each of the loop’s nodes, we are able to better visualize the process and hopefully improve it. What happens between these nodes should also be considered in order to change the speed of the loop’s cycle.

This approach can be particularly useful in designing a learning tool. The learner, when engaging with new content or knowledge that must be acquired, will first assess what is known, what resources are available, and what needs to be achieved. The second step is to choose a potential approach to absorbing the content, such as reading, taking notes, or discussing the subject matter with colleagues. Once the action is chosen, one must aim at the appropriate content to engage with, launch the action, and finally be rewarded by learning and understanding the content. We then continue back to the first step, where we assess once again what we know, what we should do, how to apply it, take action, and are rewarded by the results. Yet designing a learning tool is not limited to the learner’s core loop. Learning happens to someone, within a social and cultural context, set up by a teacher, guide, or environment, and in the interactions of these elements.
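The five-step cycle described above can be sketched as a toy simulation (all names and numbers are hypothetical; it only illustrates the assess → choose → aim → launch → reward loop applied to a learner closing a knowledge gap):

```python
def run_core_loop(known: int, target: int, max_cycles: int = 10):
    """Toy model of a learner cycling through the Core Loop until
    the knowledge gap is closed (or the cycle budget runs out)."""
    actions = ["read", "take_notes", "discuss"]
    cycles = 0
    while known < target and cycles < max_cycles:
        gap = target - known                     # 1. assess the current scenario
        action = actions[cycles % len(actions)]  # 2. choose an action
        chunk = min(gap, 2)                      # 3. aim: scope a manageable chunk
        known += chunk                           # 4. launch the action
        cycles += 1                              # 5. reward feeds the next assessment
    return known, cycles

print(run_core_loop(known=0, target=5))  # (5, 3)
```

The point of modeling it this way is that each node becomes an explicit place to intervene: a tool designer can speed up the loop by shrinking the chunk size, suggesting the next action, or making the reward more immediate.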

Going back to our Pedagogical Compass, let’s first look at what we teach. Is it useful to teach quantum physics to a learner whose talents lie primarily in the artistic realm? Will a certain piece of content help someone get a job or function better in society? It seems more than plausible to “focus what and how we teach to match what people need to know” (NETP, 2010). Therefore, when designing a learning tool, we must first consider the ultimate goal – the learning objectives and outcomes. This approach is known as backward design (Walters & Newman, 2008):

“This backward approach encourages teachers and curriculum planners to first think like an assessor before designing specific units and lessons, and thus to consider up front how they will determine whether students have attained the desired understandings.” (Walters & Newman, 2008)

“One starts with the end—the desired results (goals or standards)— and then derives the curriculum from the evidence of learning (performances) called for by the standard and the teaching needed to equip students to perform.” (Walters & Newman, 2008)

By preemptively defining how we will evidence the intended learning, we might do a better job when designing and refining each step and activity along the learning/teaching experience.

Now that we have our learning plan in place, how might we effectively transmit it to our learners? How do we teach more effectively? Can we simply use technology to do so? Can we eliminate the teacher from the process? This technocentric approach, where one believes that technological tools alone will transfer knowledge to students, is widely criticized. Yet technology makes us rethink education and the role of the teacher in a more profound way:

“Combating technocentrism involves more than thinking about technology. It leads to fundamental re-examination of assumptions about the area of application of technology with which one is concerned: if we are interested in eliminating technocentrism from thinking about computers in education, we may find ourselves having to re-examine assumptions about education that were made long before the advent of computers. (One could even argue that the principal contribution to education made thus far by the computer presence has been to force us to think through issues that themselves have nothing to do with computers.) ” (Papert, 1987)

Therefore we must not only look at the tool but at how we use it and how we interact with the learners when engaging with the content. Learning is a continuous process, a technique acquired that will leverage further and future learning – learning how to learn. Learning that it is possible. One is not born with a certain, immutable level of intelligence. Believing in this fixed notion of intelligence is potentially harmful and discourages learners from putting effort into the task. If the learner believes that progression is not possible, it becomes a self-fulfilling prophecy. The simple act of praising or criticizing one’s ‘intelligence level,’ instead of nurturing the process of learning, may prevent learners from developing a ‘growth mind-set’ and a self-guided interest in expanding their knowledge base:

“I think educators commonly hold two beliefs that do just that. Many believe that (1) praising students’ intelligence builds their confidence and motivation to learn, and (2) students’ inherent intelligence is the major cause of their achievement in school. Our research has shown that the first belief is false and that the second can be harmful—even for the most competent students. ” (Dweck, 2007)

“Understanding that interest can develop and that it is not likely to develop in isolation is essential. Further articulating the contribution of interest to student learning and its relation to other motivational variables has potentially powerful implications for both classroom practice and conceptual and methodological approaches to the study of interest. ” (Hidi & Renninger, 2006)

Now that we have glossed over what we should learn and how we might teach it, we can look at how we actually learn. Learning is a natural and innate process. We learn how to speak, how to walk, how to interact with our environment, and how to behave in society. We learn not only in formal environments such as schools and training centers, but also through interaction with others. If we look at children playing video games, research shows that they are naturally learning how to play the game, how to collaborate, and how to interact with each other with the goal of enriching their experience:

“For these reasons, we do not appeal to the games-are-highly-motivating explanation, but we do see a reason that young people play games and get them tangled up with the rest of their lives, and this reason is cultural. The phrase that best helps us explain it comes from one of our participants, Mikey, who in talking about games said, “It’s what we do.” The “we” he was referring to was kids these days, the young people of his generation.” (Stevens, Satwicz, McCarthy, 2008)

Another powerful concept is that we learn by teaching. What better way to understand a concept than to explain it to someone else? Not only must we use our metacognition to access the key elements, but we must also articulate them clearly so that others can grasp the knowledge at hand. On top of that, humans naturally seem to care more about helping others than helping themselves. Teaching others taps into an increased level of responsibility and engagement with the content – the so-called Protégé Effect. The research looked at how children taught a Teachable Agent (TA) and how this affected their own acquisition of the content.

“We then introduce TAs, which combine properties of agents and avatars. This sets the stage for two studies that demonstrate what we term the protégé effect: students make greater effort to learn for their TAs than they do for themselves.” (Chase, Chin, Oppezzo, Schwartz, 2009)

“Given our hypothesis that the protégé effect is due to social motivations, we would expect students in the programming condition to be less inclined to acknowledge ” (Chase, Chin, Oppezzo, Schwartz, 2009)

Finally, but no less importantly, we should look at who we learn from, beyond the teacher in its most traditional definition. Research shows, along with our common knowledge, that we learn from our peers, from our environment, from the media we consume, and from the interactions we engage in while doing so: Joint Media Engagement – the new co-viewing (Takeuchi and Reed Stevens, 2001):

“The variety of ways that we saw young people arrange themselves to play games surprised us, especially since most of these ways were interpersonally and emergently organized by the young people themselves.” (Stevens, Satwicz, McCarthy, 2008)

“Parents, teachers, and other adults may wish to share educational resources with their children, but teaching with media and new technologies doesn’t always come naturally, not even for experienced instructors. Provide guidance for the more capable partner in ways that don’t require a lot of prior prep or extra time, actions that can help ensure that the intended benefits of the resource are realized. ” (Takeuchi and Reed Stevens, 2001)

With this in mind, the role and actions of the teacher are greatly expanded and complicated, since they must consider not only what happens inside the classroom but also what happens outside it. Engaging students and triggering and maintaining their interest in the content is a great challenge. It can be modeled by the Four-Phase Model of Interest Development (Hidi & Renninger, 2006), which looks closely at how interest progresses from an initial, casual level of engagement to a deeper involvement with the subject matter. The teacher’s role is to provide positive feelings towards the content, generate curiosity to encourage further research, provide opportunities for learning by offering content and pointers to meaningful resources, and guide research to enable learning progression. By providing this, a learner’s interest moves from Triggered Situational Interest to Maintained Situational Interest to Emerging Individual Interest and finally to a Well-Developed Individual Interest.

Designing learning tools might be the most complex challenge we face in our society, and not only from a pedagogical standpoint. We must look at the scalability of teaching, content relevance, socio-cultural implications, cognitive developmental stages, interaction with peers, policy, assessments, teacher professional development, costs, and implementation – to list a few. We invite you to become part of this ever-evolving field, take on the challenge of creating a better future for humanity, and develop, implement, and research how we might help spread knowledge across the world in an effective, considerate, and meaningful way. We need designers, teachers, engineers, developers, psychologists, philosophers, doctors, lawyers, leaders, and anyone with the desire and drive to share knowledge and improve the tools we have to do so.

Tech 4 Learners – Final – Notes

Reading Notes

National Education Technology Plan

  • Focus on technology but need to use it for PD
  • Focus Areas:
    • Learning
    • Assessment
    • Teaching
    • Infrastructure
    • Productivity

Understanding by Design

  • Backwards design or backwards planning
  • Clear learning objectives
  • How could we incorporate game design practices into education?

Computer Criticism vs. Technocentric Thinking

  • Ed Tech is not the silver bullet – must come with pedagogy and PD

In-Game, In-Room, In-World

  • Kids learn plenty from each other
  • Kidification of education

 The Perils and Promises of Praise

  • Growth mindset
  • Constructive praise – effort and process not ability itself (you’re so smart!)

Four-Phase Model of Interest Development

  • Model
    • Triggered Situational Interest
    • Maintained Situational Interest
    • Emerging Individual Interest
    • Well Developed Individual Interest
  • Teacher’s interest is probably best predictor of effective teaching
  • Teacher’s role is to provide:
    • Positive feelings
    • Generate curiosity
    • Provide opportunities
    • Guide on research

 The New Coviewing: Joint Media Engagement

  • Design Guide
    • Mutual engagement
    • Dialogic inquiry
    • Co-creation
    • Boundary crossing
    • Intention to develop
    • Focus on content, not control
  • Challenges
    • Parents too busy
    • Parents unaware of needs
    • Don’t enjoy the same content
    • Desired interactions not always triggered
    • Little continuity into other family activities
    • Distractions are always present
  • Design principles
    • Kid driven
    • Multiple planes of engagement
    • Differentiation of roles
    • Scaffolds to scaffold
    • Transmedia storytelling
    • Co-creation
    • Fit
  • “What goes on between people around media can be as important as what is designed into the media”

Teachable Agents and the Protégé Effect

  • Care more about pleasing others than oneself, so having someone you need to help enhances learning through teaching this person

 Tangible Bits: Beyond Pixels

  • Tangible User Interfaces

 Horizon Reports

  • re-teaching our teachers how and what to teach

Paper Planning

Pedagogical Compass

North – what we teach

  • Content relevance
    • “focus what and how we teach to match what people need to know ” (NETP, 2010)
    • “It leverages the power of technology to provide personalized learning and to enable continuous and lifelong learning. ” (NETP, 2010)
    • “Build tools and experiences that revolve around a child’s existing interests, not just prescribed topics. To do so, producers need to design mechanisms that make children’s interests visible and can assist adults in responding to them. ” (Takeuchi and Reed Stevens, 2001)
    • “Joint media engagement can be a useful support for developing literacy, including basic reading ability, cultural literacy, scientific literacy, media literacy, and other 21st century skills.” (Takeuchi and Reed Stevens, 2001)
  • Assessment
    • “technology-based assessments can provide data to drive decisions on the basis of what is best for each and every student and that, in aggregate, will lead to continuous improvement across our entire education system. ” (NETP, 2010)
    • “This backward approach encourages teachers and curriculum planners to first think like an assessor before designing specific units and lessons, and thus to consider up front how they will determine whether students have attained the desired understandings.” (Walters & Newman, 2008)
  • Teacher’s interest
    • Teacher’s interest is probably best predictor of effective teaching – Lucas

 South – how we teach

  • Teacher Professional Development
    • “Professional educators are a critical component of transforming our education systems, and therefore strengthening and elevating the teaching profession is as important as effective teaching and accountability. ” (NETP, 2010)
  • Curriculum construction
    • Backwards design or backwards planning – Clear learning objectives
      • “One starts with the end—the desired results (goals or standards)—and then derives the curriculum from the evidence of learning (performances) called for by the standard and the teaching needed to equip students to perform.” (Walters & Newman, 2008)
  • Using technology wisely
    • “Assigning roles to participants so that tasks and content match up to individual maturity is another way of ensuring that everyone is suitably challenged and/or entertained.”  (Takeuchi and Reed Stevens, 2001)
    • “Parents, teachers, and other adults may wish to share educational resources with their children, but teaching with media and new technologies doesn’t always come naturally, not even for experienced instructors. Provide guidance for the more capable partner in ways that don’t require a lot of prior prep or extra time, actions that can help ensure that the intended benefits of the resource are realized. ” (Takeuchi and Reed Stevens, 2001)
    • “Mark Weiser’s seminal paper on Ubiquitous Computing [54] started with the following paragraph:
      “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
      I do believe that TUI is one of the promising paths to his vision of invisible interface. ” (Ishii, 2008)
    • “Combating technocentrism involves more than thinking about technology. It leads to fundamental re-examination of assumptions about the area of application of technology with which one is concerned: if we are interested in eliminating technocentrism from thinking about computers in education, we may find ourselves having to re-examine assumptions about education that were made long before the advent of computers. (One could even argue that the principal contribution to education made thus far by the computer presence has been to force us to think through issues that themselves have nothing to do with computers.) ” (Papert, 1987)
  • Student feedback
    • Praise effort and not ability:
      • “I think educators commonly hold two beliefs that do just that. Many believe that (1) praising students’ intelligence builds their confidence and motivation to learn, and (2) students’ inherent intelligence is the major cause of their achievement in school. Our research has shown that the first belief is false and that the second can be harmful—even for the most competent students. ” (Dweck, 2007)
      • “Maybe we have produced a generation of students who are more dependent, fragile, and entitled than previous generations. If so, it’s time for us to adopt a growth mind-set and learn from our mistakes. It’s time to deliver interventions that will truly boost students’ motivation, resilience, and learning. ” (Dweck, 2007)
  • Managing motivation and student interest
    • “In fact, teachers often think that students either have or do not have interest, and might not recognize that they could make a significant contribution to the development of students’ academic interest (Lipstein & Renninger, 2006)” (Hidi & Renninger, 2006)
    • “In general, findings from studies of interest suggest that educators can (a) help students sustain attention for tasks even when tasks are challenging—this could mean either providing support so that students can experience a triggered situational interest or feedback that allows them to sustain attention so that they can generate their own curiosity questions; (b) provide opportunities for students to ask curiosity questions; and (c) select or create resources that promote problem solving and strategy generation. ”  (Hidi & Renninger, 2006)
    • “Understanding that interest can develop and that it is not likely to develop in isolation is essential. Further articulating the contribution of interest to student learning and its relation to other motivational variables has potentially powerful im- plications for both classroom practice and conceptual and methodological approaches to the study of interest. ” (Hidi & Renninger, 2006)
  • Trends (NMC Horizon Reports)
    • Blended Learning
    • Open Educational Resources
    • Digital Literacy
    • Integrating Technology in Teacher Education
    • Rethinking Roles of teacher

 East- how we learn

  • Access to education
    • “The underlying principle is that infrastructure includes people, processes, learning resources, policies, and sustainable models for continuous improvement in addition to broadband connectivity, servers, software, management systems, and administration tools.” (NETP, 2010)
  • Growth mindset
    • “Other students believe that their intellectual ability is something they can develop through effort and education. They don’t necessarily believe that anyone can become an Einstein or a Mozart, but they do understand that even Einstein and Mozart had to put in years of effort to become who they were.” (Dweck, 2007)
  • Learn from culture
    • “For these reasons, we do not appeal to the games-are-highly-motivating explanation, but we do see a reason that young people play games and get them tangled up with the rest of their lives, and this reason is cultural. The phrase that best helps us explain it comes from one of our participants, Mikey, who in talking about games said, “It’s what we do.” The “we” he was referring to was kids these days, the young people of his generation.” (Stevens, Satwicz, McCarthy, 2008)
  • Four-phase model of interest development
    • Triggered Situational Interest
    • Maintained Situational Interest
    • Emerging Individual Interest
    • Well Developed Individual Interest
  • Learn by teaching – protégé effect
    • “We then introduce TAs, which combine properties of agents and avatars. This sets the stage for two studies that demonstrate what we term the protégé effect: students make greater effort to learn for their TAs than they do for themselves. ” (Chase, Chin, Oppezzo, Schwartz, 2009)
    • “Given our hypothesis that the protégé effect is due to social motivations, we would expect students in the programming condition to be less inclined to acknowledge ” (Chase, Chin, Oppezzo, Schwartz, 2009)

 West – who we learn from

  • Technocentric views
  • Learn from peers
    • “The variety of ways that we saw young people arrange themselves to play games surprised us, especially since most of these ways were interpersonally and emergently organized by the young people themselves. ” (Stevens, Satwicz, McCarthy, 2008)
    • “In fact, shared attentional focus on media in real time is a powerful interactional resource not found in most contemporary asynchronous social media, and researchers across a range of disciplines highlight the importance of joint attention for learning and meaning- making (e.g., Barron, 2000, 2003; Brooks & Meltzoff, 2008; Bruner, 1983, 1995; Goodwin, 2000; Meltzoff & Brooks, 2007; Stevens & Hall, 1998; Tomasello, 1999, 2003). ” (Takeuchi and Reed Stevens, 2001)
    • “Stevens, Satwicz, and McCarthy’s (2008) naturalistic studies of siblings and friends playing video games together at home examined the spontaneous instances of teaching and learning that players set up among themselves during gaming sessions, as well as how their in-room interactions connect with what’s going on inside the game and in their lives outside the home (e.g., school). ” (Takeuchi and Reed Stevens, 2001)
  • Learn from teachers whose roles are to provide: (Hidi & Renninger, 2006)
    • Positive feelings
    • Generate curiosity
    • Provide opportunities
    • Guide on research
  • Parents (coviewing)
    • “To get families to use a new platform with any regularity, it should easily slot into existing routines, parent work schedules, and classroom practices. There are, after all, only so many hours in the day to accommodate new practices.” (Takeuchi and Reed Stevens, 2001)
    • “What children learn and do with media depends a lot on the content of the media, but they depend perhaps as much on the context in which they are used or viewed, and with whom they are used or viewed.” (Takeuchi and Reed Stevens, 2001)
  • Society

Pedagogical Compass Planning

Act 1 – Why should you read this paper?

Want to become a learning tool designer? Care about learners? Care about teachers? Care about reducing the digital literacy gap?

Who are you? Teacher? Policy maker? School leader? Designer? Engineer? Developer?

The compass:

– North – what we teach

– South – how we teach

– East – how we learn

– West – who we learn from

 Act 2 – Evidence

How is LX design similar and different from:

UX designer – consider:

– User

– Usability

– Task at hand

Game Designer

– Learning the game – onboarding instructions

– Engagement – motivation, interest, reward systems, core loop

– Game mechanics

Learning Experience Designer

– Learning objectives

– Differentiation

– Cognitive developmental stages

– Cultural context

– Joint media engagement and co-viewing

– Learning from peers – protégé effect, learn by teaching

Act 3 – Conclusion

LX is probably the most complex type of design there is. Have to consider:

– The learner

– The teacher

– The environment

– The peers

– The cultural context

– Assessment

– Learning objectives

– Policy

– Costs

– Implementation

– Scalability

Qualitative Research – Final Reflection

Assignment

Individual Process Paper Requirements

This paper is a final reflection on the process of doing qualitative research. In this paper you should:

  • Describe your growth as a qualitative researcher over the past 10 weeks using concrete details and examples to demonstrate areas of growth as well as areas you are still mastering
  • Reveal how you are pushing yourself toward new understandings, especially concerning the complexity of the research process
  • Connect your experience to class readings and class discussions. Show us some key topic areas you are grappling with… Be sure to use proper APA format

You may want to revisit past RDRs and show how your thinking has progressed. You may want to reflect on topics such as contextual interpretation, subjectivity, ethics, the analysis process, validity, and rigor.

Process papers should be between 4 and 8 double-spaced pages, not to exceed 8 pages.

Group mini-products will be evaluated separately from individual process papers. We will average the group grade on the mini-product with the individual grade on the process paper. 

Response

Abstract

This paper is a review of the learning process I have gone through this quarter in this class, in the form of a qualitative research paper. I propose to expose my journey from someone who had barely ever thought about research, let alone qualitative research, to someone who is now able to appreciate the power of this method of analyzing the world around us. Instilled with my own bias and metacognition, I will describe the salient concepts acquired through the readings, class activities, and assignments.

The research question I want to answer is: “How does Lucas understand the qualitative research process?”

Introduction

As an engineering undergraduate, research for me was far in the realm of doctoral students and the confines of microfilms in the libraries. As a worker, I was always involved in project management and the implementation of software systems – very hands-on and practical work, with little need to do or consume research.

Coming into the LDT program I had to decide between qualitative and quantitative research methods. My reasoning was that I’ve got some statistical background from Industrial Engineering and that I had virtually no contact with qualitative research. I have not regretted this decision and feel that the course has provided me with valuable skills for observing the world and for consuming and producing qualitative research. It has given me a whole new set of lenses to critique my own design and thought process.

Hopefully this paper will illustrate the main takeaways from the course while evidencing my learning process and methods. By no means do I intend to claim that this qualifies as true qualitative research, as the process of data collection and analysis was not initiated as such – it was an afterthought that induces a top-down approach to finding meaning. I came in with what I wanted to find in the data and found it. My personal bias is also exacerbated by the fact that I am a full participant-observer (Taylor & Bogdan, 1998). I tried to be as objective as possible and hopefully attended to at least some of the “Criteria for a Good Ethnography” (Spindler & Spindler, 1987, pp. 18-21):

      1. Observations are contextualized: I attempted to describe my individual process in this paper yet leaving out the in-class description since the intended audience of this paper were part of this context.
      2. Hypotheses emerge in situ: the learning process and this paper show that I came in with no prior knowledge of the subject and came out with what feels like a solid basis for future work.
      3. Observation is prolonged and repetitive: is a quarter long enough? Was I really observing my own actions repetitively? I could argue towards both ends of the spectrum: if I was not consciously observing myself for the purpose of this research paper, the observations were not made. On the other hand, my blog, assignments, and memory serve me with sufficient data for this analysis.
      4. Native view of reality is attended to: well, I don’t think I can go more native than being the native myself.
      5. Elicit sociocultural knowledge in a systematic way: documenting every interaction with the course in my blog could be considered a systematic approach to eliciting my sociocultural knowledge, even though there is no record of sociocultural factors that might have affected my learning.
      6. Data collection must occur in situ: in the sense that I am collecting data from myself, I would consider that all data was collected by me, for me, and within myself.
      7. Cultural variations are a natural human condition: I was unable to find, throughout the process, that my cultural background somehow affected my learning. Even though I am from Brazil, my education has been entirely within the American and British systems, allowing me to feel ‘at home’ in this context and with the readings presented.
      8. Make what is implicit and tacit to informants explicit: hopefully I am able to lay out implicit behaviors and communication patterns in this paper by detailing the thought process behind each claim.
      9. Interview questions must promote emic cultural knowledge in its most natural form: I used the questions presented in the description of this assignment as a guide during my self-mental-interview. I feel like they were sufficient to elicit what I have learned.
      10. Data collection devices: I used pencil, paper, camera, and the blog as devices to collect my data.

Surprisingly, according to the analysis above, this paper could very well qualify as a qualitative research paper. As discussed in the last class of this course, there are several examples of alternative and artistic research, such as poems, performances, novels, and documentaries. ‘The field allows it all’ (notes from week 10 class, 2015). All in all, I felt that this was a valid approach to structuring and presenting the data collected, even though the data collection itself was not originally intended for the purpose of this paper – but for the purpose of learning.

Methodology

Data Collection

The structure of the course involved a series of readings, mini-lectures, in-class group discussions, individual papers, and practice of qualitative research. The main topics were presented in a logical progression (Appendix A) that scaffolded our understanding towards the existing base knowledge of the field. A series of readings was assigned to support our in-class discussions and to present current research and thinking on each topic. Written assignments were used to assess the class’ progression through the course. Finally, we conducted a short practice version of qualitative data collection and then proceeded to analyze the data and present a mini-product.

My methodology for absorbing the content was primarily to engage with it by attending all classes and by reading and writing all that was assigned. While reading and during class, I noted down on paper the important concepts that jumped out at me. I was testing the notion that by going analogue and physically writing down my thoughts I might get the benefits of embodied learning: “The embodied interaction with things creates mechanisms for reasoning, imagination, “Aha!” insight, and abstraction. Cultural things provide the mediational means to domesticate the embodied imagination.” (Hutchins, 2006, p.8) These notes were then photographed and posted to my blog (lucaslongo.com) for archival purposes.

Data Analysis

For this paper, I wallowed through the data – my notes – and mentally interviewed myself about the entire experience. I produced amended notes summarizing the general pieces of knowledge I had absorbed (Appendix B). These notes served as initial guides to the subject matter to be included in this paper. They also inspired me by presenting the opportunity to experience grounded theory (Taylor & Bogdan, 1998), in the sense that writing this paper as a qualitative research paper was the best way to present what I learned from this course.

As a hyper-metacognitive participant observer in this research process, I will now present the main propositions from the readings and the practice research process.

Findings 

The assignment of conducting qualitative research was a crash course in the field. Even though it was highly structured and scaffolded by the educators, the process allowed us to experience the multiple steps, processes, and analyses required. The progression of observing, preparing interview questions, interviewing, making sense of the data, and finally writing it up felt like a genuine simulation of the real thing.

In particular, we had very little time to come up with a context we wanted to observe and to define a research question that interested us. For me that was, and still seems to be, the hardest step of research: what is an interesting question to ask? Is there a problem to be found? How much research has already been done in this area? Do I know enough about the context to be able to extract meaning from it? But I guess this is the key and the seed of all research, alluding to the “1% inspiration, 99% perspiration” mantra that echoes in my head from my undergraduate studies.

The observation and interview processes did not yield many insights for me, other than the interview question preparation phase. I had never structured an interview before and found that the strategies discussed in class and in the readings were extremely helpful for understanding how to better extract information from informants. Probing and markers were the concepts that stood out most for me as techniques I will take with me.

The process of analyzing the data and writing up the product showed me how much data was collected from a simple one-hour observation and two hour-long interviews. I was also surprised at how much meaning can be extracted from micro-analyzing what the informant said. Not to mention the fact that our final conclusion, or theory, truly emerged from the data. My group was worried that, as much as we discussed, we did not feel we had anything interesting to say about our context. At the last moment, when arranging the propositions, a general cohesive thought emerged from them, allowing us to generate a conclusion that was both backed by evidence and meaningful to us. I was initially skeptical about the method of exhaustively coding the data, yet having experienced it first hand, my convictions were completely debunked.

Finally, one framework I found very helpful in the process was Petr’s diagram (Appendix C), which made the process somehow tangible in my mind. It is a great representation of grounded theory and the qualitative research process. Obviously this diagram was backed up by our readings and fruitful class discussions, without which it would not have had such an impact on me. It especially helped me in thinking about and creating propositions – the claims that we could back up with evidence – all the way to the future implications of our findings: “Turtles all the way down”.

Conclusion

Throughout this paper, I attempted to summarize the learning process I went through and what I learned from this course. It has further consolidated my learning about qualitative research, validated some of my learning methods, and made me aware of all the pedagogical techniques designed into the course itself. Considering that I would not have been able to engage in a meaningful conversation with other qualitative researchers prior to this course, I consider this experience a success in learning (and being taught) about the field. Thank you.

Looking ahead, I see room for improvement in my writing skills, especially in citing previous research. This ties into my technique of reading and note taking. I look back at the readings and find no highlights of meaningful phrases. My notes, as photographs on the blog, are not searchable. Because of this, I had to go back into the readings to pull out citations, and I had to decipher my sometimes messy handwriting and make sense of it. With this in mind, I am abandoning handwritten notes in favor of going straight to digital.

I also feel that I have to work on my own master’s project research question and start to plan out my research. I feel that this class gave me significant skills, techniques, and concepts to be able to do so. My entrepreneurial traits have a tendency to look for a solution with a top-down approach. Now I have grounded theory to reduce my anxiety about getting to ‘The’ solution – I see that I must dive into the context I want to meddle in, observe it exhaustively, understand how the natives navigate it, analyze, and then finally be better equipped to propose, claim, and, who knows, solve a problem.

Appendix A – Course Progression

Concepts

  1. The Nature of Qualitative Research
  2. Qualitative Methods — Why and When
  3. Data Collection: Observation
  4. Data Collection: Interviewing
  5. Examining Subjectivity
  6. Analysis: Making Sense of the Data
  7. Considering Validity and Rigor
  8. Ethical Issues

Readings

See Reference section below

Assignments

  1. RDR #1: The Observation Process
  2. Qualitative Research Critique
  3. RDR #2: The Interview Process
  4. Draft of “mini-products” 
  5. Qualitative Product Paper
  6. Qualitative Process Paper (this paper)

Appendix B – Amended Notes

Notes I generated in preparation for this paper:

Paper’s Structure:  

Act 1
Tell story from Week 1 – Week 10
Novelty of the subject

Act 2
Readings
Observation/Interview
Data analysis

Act 3
Main takeaways
Strengths and weaknesses
Room for improvement

Wallowing through blog notes:

Main takeaways from the class: 

  • Qualitative research – or research itself.
  • The power of writing
  • Frameworks and concepts
    • Turtles all the way down
    • I as a camera
    • Grounded theory
    • Probing
  • Criteria for good ethnography
  • Participant observer – cool! Almost like spy work
  • Finding a research problem – that’s the hardest part I think
  • Interview preparation
  • Interview behavior
  • Coding – did not believe in it at first
  • Propositions – Petr’s diagram
  • Validity – just be clear how you wrote it – Geisha
  • Learning acquires you – Legitimate Peripheral Participation

 Pushing myself

  • Improve on writing skills
  • Read and read and read more research
  • Identify my own research problem
  • Tension between researching and creating solutions
  • Stand on giants’ shoulders and do something?
  • Become a giant for others to be able to do something?
  • Interview process I think I’d do well
  • Need to practice extracting meaning from data; it is not instinctive for me and never has been. I take facts at face value, which may be a good quality for less-biased field data collection and analysis.

Appendix C – Petr’s Research Diagram

[Image: Petr’s research diagram]

References

Note: references are not in alphabetical order to preserve chronological sequence

Reading Assignments:

The Nature of Qualitative Research

Merriam, S. (2002). Introduction to Qualitative Research. In S. Merriam & Associates (Eds.) Qualitative Research in Practice. San Francisco: Jossey-Bass. pp. 3-17.

Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis: An Expanded Sourcebook. (Second Edition). Thousand Oaks, CA: Sage. pp. 1-12.

Spindler, G. & Spindler, L. (1987). Teaching and Learning How to Do the Ethnography of Education. In G. Spindler & L. Spindler (Eds.) Interpretive Ethnography of Education at Home and Abroad. Hillsdale, NJ: Lawrence Erlbaum Associates. pp. 17-22.

Creswell, J. (2003). “A Framework for Design,” Research design: Qualitative, Quantitative and Mixed Methods Approaches (2nd edition). Thousand Oaks, CA: Sage. pp. 3 -24.

Becker, H. (1996). The Epistemology of Qualitative Research. In R. Jessor, A. Colby, & R. Shweder (Eds.) Ethnography and Human Development. Chicago: University of Chicago. pp. 53-71.

Geertz, C. (1973). “Thick Description,” The Interpretation of Cultures. New York: Basic Books. pp. 3-30.

Data Collection

Taylor, S., & Bogdan, R. (1998). “Participant Observation, In the Field,” Introduction to Qualitative Research Methods. (Third Edition). New York: John Wiley & Sons. pp. 45-53, 61-71.

Glesne, C., & Peshkin, A. (1992). “Making Words Fly,” Becoming Qualitative Researchers: An Introduction. White Plains, NY: Longman. pp. 63-92.

Weiss, R. (1994). “Interviewing,” Learning from Strangers: The Art and Method of Qualitative Interview Studies. NY: Free Press. pp. 61-83, 107 – 115.

Subjectivity

Peshkin, A. (1991). “Appendix: In Search of Subjectivity — One’s Own,” The Color of Strangers, The Color of Friends. Chicago: University of Chicago. pp. 285-295.

Peshkin, A. (2000). The Nature of Interpretation in Qualitative Research. Educational Researcher 29(9), pp. 5-9.

Analysis

Taylor, S., & Bogdan, R. (1998). “Working With Data,” Introduction to Qualitative Research Methods. (Third Edition). New York: John Wiley & Sons. pp. 134-160.

Charmaz, K. (1983). “The Grounded Theory Method: An Explication and Interpretation,” In R. Emerson (Ed.) Contemporary Field Research: A Collection of Readings. Boston: Little, Brown. pp. 109-126.

Graue, M. E., & Walsh, D. (1998). Studying Children in Context: Theories, Method, and Ethics. Thousand Oaks: Sage. pp. 158-191 and 201-206.

Page, R., Samson, Y., and Crockett, M. (1998). Reporting Ethnography to informants. Harvard Educational Review, 68 (3), 299-332.

Emerson, R., Fretz, R., & Shaw, L. (1995). “Processing Field Notes: Coding and Memoing,” Writing Ethnographic Field Notes. pp. 142 – 168.

Validity and Rigor

Johnson, R. (1997). Examining the Validity Structure of Qualitative Research. Education, 118, pp. 282-292.

Wolcott, H. (1990). On Seeking –and Rejecting– Validity in Qualitative Research. In E. Eisner & A. Peshkin (Eds.) Qualitative Inquiry in Education: The Continuing Debate. New York: Teachers College. pp. 121-152.

AERA (2006). Standards for Reporting on Empirical Social Science Research in AERA Publications. Educational Researcher 35(6), pp. 33-40.

Anfara, Jr., V., Brown, K., & Mangione, T. (2002). Qualitative Analysis on Stage: Making the Research Process More Public. Educational Researcher 31(7), pp. 28-38.

Ethics

Altork, K. (1998). You Never Know When You Might Want to Be a Redhead in Belize. In K. deMarrais (Ed.) Inside Stories: Qualitative Research Reflections. Mahwah, NJ: Lawrence Erlbaum. pp. 111-125.

Lincoln, Y. (2000). Narrative Authority vs. Perjured Testimony: Courage, Vulnerability and Truth. Qualitative Studies in Education 13(2), pp. 131-138.

Products of Qualitative Research

Cohen, D. (1990). A Revolution in One Classroom: The Case of Mrs. Oublier. Educational Evaluation and Policy Analysis 12(3), pp. 311-329.

McDermott, R. (1993). Acquisition of a Child by a Learning Disability. In S. Chaiklin & J. Lave (Eds.) Understanding Practice. Cambridge: Cambridge University. pp. 269-305.

Rosenbloom, S., & Way, N. (2004). Experiences of Discrimination among African American, Asian American, and Latino Adolescents in an Urban High School. Youth and Society 35(4), pp. 420-451.

Other readings:

Edwin Hutchins (2006). Learning to navigate. In S. Chaiklin & J. Lave. (Eds.). Understanding practice: Perspectives on activity and context, pp. 35-63. New York: Cambridge University Press.

Qualitative Research – Final Product

Great working with James and Ana on this project!

Text version below and nicely formatted version here: Final Product.pdf

ABSTRACT

In this qualitative study, individuals involved with the Learning Innovation Hub (iHub) were studied to address the research question, “How does iHub facilitate collaboration between educators and entrepreneurs to promote education technology innovation and adoption?” To this end, an observation of the iHub fall 2015 orientation and two interviews with iHub Manager Anita Lin were conducted over the course of three weeks. iHub was found to facilitate collaboration between teachers and startups by treating teachers as key agents in edtech adoption and focusing on teacher needs. However, iHub does not focus on other stakeholders in the education ecosystem beyond teachers. This raises concerns about iHub’s impact on outcomes for learners.

(Keywords: education technology; edtech innovation; edtech adoption; iHub)

1 INTRODUCTION

Technology has the potential to revolutionize the ways in which we teach and learn. In recent years, a surge of education technologies has pushed more products into the hands of educators and learners than ever before. Investments in edtech companies have skyrocketed; during just the first half of 2015, investments totaled more than $2.5 billion, markedly surpassing the $2.4 billion and $600 million invested in 2014 and 2010, respectively (Appendix A) (Adkins, 2015, p. 4). In the 2012-13 academic year, the edtech market represented a share of $8.38 billion, up from $7.9 billion the previous year (Richards and Stebbins, 2015, p. 7). But how do educators find the education technologies that actually improve learning outcomes in a space increasingly crowded with many players and products?

The Learning Innovation Hub (iHub) is a San Jose-based initiative of the Silicon Valley Education Foundation (SVEF) in partnership with NewSchools Venture Fund. Funded by the Bill & Melinda Gates Foundation, iHub aims to provide an avenue “where teachers and entrepreneurs meet.” iHub seeks to develop an “effective method for testing and iterating the education community’s most promising technology tools” (iHub website).

To this end, iHub coordinates pilot programs of edtech products in real school settings. The iHub model involves:

(1) recruiting early-stage edtech startups with in-classroom products to apply to the program,

(2) inviting shortlisted companies to pitch before a panel of judges,

(3) selecting participating startups,

(4) matching startups with a group of about four educators who will deploy products in their classrooms,

(5) jointly orienting educators and entrepreneurs prior to the adoption of the technology in the classroom, and

(6) guiding communication among participants throughout the pilot and feedback phase.

iHub plays a unique role in the edtech ecosystem of Silicon Valley given its position as a not-for-profit program that does not have a financial stake in the startups. As such, we are interested in better understanding iHub’s impact on improving learning outcomes through technology. This study seeks to address the following research question:

How does iHub facilitate collaboration between educators and entrepreneurs to promote education technology innovation and adoption?

2 METHODOLOGY

We followed a prescribed sequence from framing our research question through data collection and analysis. Although we did not conduct a formal literature review on the research topic, members of the research team began the project with prior experience with education technology use and adoption in the classroom. We also conducted an informal observation of the organization prior to the official start of the project: we attended the iHub Pitch Games, during which the startups were selected for participation in the fourth cohort. Our subject was selected based on a combination of convenience sampling and our team’s shared interest in the subject.

2.1 Data Collection

We used three primary sources of data collection: online documentation, an observation, and interviews. This source triangulation strengthens the reliability of our findings and affords us multiple insights into the native view, helping us understand iHub’s strategies for facilitating collaboration between educators and entrepreneurs.

We began our official data collection through the iHub website, which lays out the overarching priorities of the iHub program and which we continued to consult throughout the study. The website afforded us a preliminary understanding of what the program does. With this written information, we were able to compare what the program claims to do (on its website) with what it actually does (as demonstrated in the observation) and what it says it does (as elucidated by the interviews).

We continued our data collection by conducting a one-hour observation of iHub’s fall orientation (Appendix B). The orientation represents the first in-person point of contact between participating educators and entrepreneurs of the fourth iHub cohort. Coordinated by SVEF staff and spearheaded by Lin, it served as the ideal occasion for observation, as it showcased iHub’s role as a facilitator of communication and collaboration between educators and entrepreneurs. Uniting everyone in the same room, the orientation dealt with everything from high-level discussions of the goals of iHub down to the administrative details of the initiative. Both raw and amended notes were kept by all three researchers.

In the two weeks following the orientation observation, we conducted two one-hour interviews with Lin. Lin was selected as the ideal interview subject given her accessibility as gatekeeper to the research team, her position as iHub Manager, and her deep understanding of the iHub initiative. A peer-reviewed interview guide was used in both interviews, though interviewers let questions emerge in situ as appropriate. The first interview sought to garner an understanding of the overarching goals and priorities of both SVEF as the parent organization and iHub as the specific program of interest. We explored what the organization does, what its processes look like, and Lin’s role within iHub. While the interview uncovered some areas for deeper discussion, we were intentional in keeping the inquiries of the first interview at an introductory level and saving probes for the second interview. The second interview, in turn, homed in on a more granular discussion of iHub’s role in the technology adoption process and learning outcomes. Both interviews were voice-recorded, and approximately thirty minutes of each interview were transcribed (Appendix C).

2.2 Data Analysis

Our data analysis process went hand-in-hand with our data collection process, allowing us to make adjustments to our concepts, themes, and categories throughout our research. While we did not create memos per se, individual researchers’ descriptions and reflections served to clarify and elucidate some of the themes and insights that emerged throughout the process. Raw and amended notes and interview transcriptions were coded with the following jointly designed list of codes:

  •   Educator feedback
  •   Entrepreneur feedback
  •   Examples of success
  •   Examples of challenges
  •   Focus on early-stage startups
  •   Focus on educators
  •   Funding partnerships
  •   Metrics for success
  •   Neutrality
  •   Opportunities for improvement
  •   Organizational design
  •   Stakeholder alignment
  •   Tension between decision makers
  •   The iHub model/framework

From there, we were able to identify themes and patterns in our data. As our research question seeks to understand a phenomenon, a grounded theory approach proved most appropriate. Using this grounded theory method, we derived the propositions described below.

3 FINDINGS

3.1 iHub sees teachers as key agents in edtech adoption.

iHub sees teachers as key agents in edtech adoption. While the organization understands that entrepreneurs, school principals, district managers, and policy makers are all stakeholders in this process, it views teachers as the strongest drivers:

What we have heard from teachers and from districts, is that a lot of times for a school for adopt or…use a product across their school, it’s because a group of teachers have started of saying “I’ve been using this product. I really like this product. Hey, like friend over there! Please use this product with me,” and they are like, “Oh! Yeah we like it,” and kind of builds momentum that way (Interview Transcripts, 2015).

Under this notion that teachers can be strong advocates of edtech products, iHub is looking to adjust its curriculum around teachers as the key agents:

“So we kind of have been thinking about how do we build capacity of teachers to advocate for products they think are working well” (Interview Transcripts, 2015).

They also initiate their pilot cycles with the teachers defining what their current needs are, prior to selecting the entrepreneurs that are going to participate:

“So we send out to our teachers and they’ve kind of, I would say vaguely, have defined the problem of need, and we’d like to kind of like focus them on the future” (Interview Transcripts, 2015).

Innovation, then, is driven by what teachers need in the classroom. These teachers are hand-picked based on their proficiency in adopting technology and their likelihood of giving better feedback and needs statements:

I think teachers who we pick, we try to pick ones who are very…very experienced with using tech in the classroom and so I think that you, you find that teachers who use tech in the classroom, you…it’s like their instruction is different (Interview Transcripts, 2015).

Going further, iHub wants to promote a community of practice to enable discussion and scaffolding amongst teachers open to edtech adoption:

And so I think our program is also to help teachers who are early adopters of technology, help them kind of meet other teachers at different school for early adopters, and build a cohort that understands that and kind of can refer to each other (Interview Transcripts, 2015).

Finally, when looking at entrepreneurs, iHub sees teachers as the key agents of their success:

So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and kind of help it kind of just move along (Interview Transcripts, 2015).

3.2 iHub has a focus on teacher needs.

iHub’s belief that teachers are key agents in edtech adoption leads it to focus on teacher needs. Many of iHub’s processes and resources revolve around satisfying the needs and constraints teachers might have in the process. For instance, the orientation, though it brought together all participating stakeholders, was framed around the needs of teachers. Rhetoric revolved around “how do we choose the technology we use in the classroom?” (Observation Amended Notes, 2015).

The focus on teacher needs also became evident during our interviews with Lin.  In fact, the program’s existence is rooted in the perceived needs of teachers:

Schools DO need edtech products…They want products that do x, y, z. But they don’t really know how to go about and find them. So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and help it just move along (Interview Transcripts, 2015)

Lin explicitly described one of iHub’s main goals as:

To help teachers who are early adopters of technology, help them kind of meet other teachers at different school for early adopters, and build a cohort that understands that and kind of can refer to each other…we kind of want to help teachers understand how to use it—edtech—in their classroom (Interview Transcripts, 2015).

Furthermore, Lin acknowledged that iHub gives teachers various opportunities to communicate their needs:

There are lots. Every time we have meetings, we are very…open about that. And I also think, we have surveys, so there’s a lot of, we send out a lot of surveys about a lot of…different, specific, different…happenings, and so after…orientation happened, there was a survey that was sent out about that. After the Pitch Games happened, there was a survey about that…I also think that during the rounds if we have strong relationships with teachers, which is typically the case, then teachers are very open with us. At the end, I’ll be like, you know, we’d love to hear your feedback, and they’ll just tell us, you know, we’d love it if there was this, this, this, this (Interview Transcripts, 2015).

Lin also shared two instances in which feedback from teachers was implemented to improve this focus on teacher needs. In the first, Lin pointed out:

We’ve made a lot of those changes based on teacher feedback.  Like for example, the reason why teacher teams are at a school site this year instead of from all different schools is partly because it makes sense, I think, to scale, but also because that was one of the big pieces of feedback that was given from the beginning (Interview Transcripts, 2015).

In the second, Lin highlighted a major change in the mechanics of the pilot program: instead of having one teacher per school, they are now working with teams of teachers from each school. This change was based on feedback from teachers who wanted more collaboration amongst themselves:

Yeah, so I think for our teachers we would like them to meet up kind of weekly. And when you’re not at the same school it’s a lot harder to meet on a weekly because maybe one night one school has their staff meeting and then the other night the other school has their staff meeting and then, you know, I think it was a lot of commitment to ask and I think a lot of teams found it really challenging and maybe would not always be there because of that (Interview Transcripts, 2015)

Additionally, in her description of the structure of the program, Lin conceded that the number of companies accepted into the program is contingent on the availability of teachers: “For us its capacity of teachers…In our last round we supported 25 teachers. And this round we have about 13 teams of teachers” (Interview Transcripts, 2015). In fact, the single metric for success of the iHub program that Lin identified when prompted was the number of teachers using the iHub model (Interview Transcripts, 2015).

In the discussion on iHub’s priorities, Lin also focused on outcomes for teachers.

3.3 iHub does not focus on the needs of other stakeholders beyond teachers.

Noticeably, we found no evidence that iHub is focusing on the needs of any other stakeholder group beyond teachers. At the foundation of this proposition lies the evidence supporting the previous proposition that iHub is focused on teacher needs; in other words, if iHub is focused on teacher needs, it is by default not focused on the needs of other stakeholders. The following evidence further supports the finding that iHub does not focus on the needs of other stakeholders.

When asked directly what the ideal relationship between schools and startups would be, Lin responded, “An edtech vendor is a provider, right? So they should be providing some service that fits a need that a school has or a teacher has or a student has in some way” (Interview Transcripts, 2015). She added, “They’re still…an early stage company so they’re…still growing and figuring out exactly what it looks like” (Interview Transcripts, 2015).

When asked about the school’s reciprocal responsibility to startups, Lin responded, “I don’t know, I never thought about that, it, as much that way” (Interview Transcripts, 2015). iHub does not have any expectations for what a teacher should contribute to the relationship.

iHub acknowledges that communication problems have arisen when a startup’s focus is diverted from the pilot program: “And so they became pretty unresponsive with our teachers. The teachers like, emailing me, and I’m like trying to get in contact with it.” However, iHub does not have a process for holding startups accountable: “And so typically when there’s not communication between these parties…the pilot would not be as successful as it could because they weren’t communicating” (Interview Transcripts, 2015). iHub does not have the kind of relationship with startups that would allow it to raise and address an issue such as this.

Lin even identified areas for improvement in iHub’s relationships with stakeholders such as entrepreneurs and school districts:

And so how do we support districts where maybe they’re not as on top of edtech, how do we support their administration so that they understand the role of it, understand maybe how select it, and understand how teachers use it so they can provide the support both maybe in resources but also in professional development to their teachers (Interview Transcripts, 2015).

But on a broader scale, iHub demonstrated shallow knowledge of the intricacies of the larger education system:

I would say I don’t know enough about school districts and about school…counties, offices, to be able to know whether or not they’re functional. There’s a lot of bureaucracy, I think, that comes up when you work with the county and work with…there’s so many different needs and so many different people kind of working on it that sometimes…they can’t, they’re unable to kind of do certain actions because of different reasons, whatever they are. So I don’t know (Interview Transcripts, 2015).

Finally, there is little evidence of any direct concern with learning objectives or student satisfaction. The few moments in which students were mentioned follow:

I talked about how it helps students but really a big point, I think a big selling point for districts is that it helps teachers, we give a lot of teacher professional development during that time. (Interview Transcripts, 2015)

Yeah, so I think we just want to make sure the, the products are really relevant to students, right. And so, that’s the way we do it, is that you get feedback from students and teachers, but I think those needs change, right? Each year, this year, the needs are different than last year, because this year you’re using Common Core, and last year, maybe, it wasn’t as big of a, actually it was really big last year. (Interview Transcripts, 2015)

Only when prompted for a specific success story did Lin share the story of one student with some light motor disabilities who learned from the entrepreneur’s product:

I think also a lot of the other teachers who worked with that product really, their students really enjoyed it, because it is really engaging and they were making like, connections between the fact that, you know, I’m doing math… And I think that was a like a really wonderful example of a product that went really well (Interview Transcripts, 2015).

 

4. LIMITATIONS

To have a better sense of what edtech adoption really looks like within iHub’s process, we would have needed to conduct several further observations. We were able to see the process in its very early stages, but we feel that observing the classroom setting and feedback meetings would reveal more interesting data.

What we observed was also not as relevant to the overall research question as the interviews were. What we saw was the very first meeting between teachers and startups. We observed the initial questions and doubts about the use of the product, though no teacher had yet tried it. It would have been interesting to observe the product already running in the classroom.

Our limited prior knowledge of what the organization did, of what we were observing, and of its overall goal also reduced our capacity to probe deeper into the subject matter. We might, for instance, have chosen to interview a teacher or a startup instead of the manager of the program.

5 CONCLUSION

iHub views teachers as key players in edtech adoption given their position to advocate for technology products among their colleagues and other stakeholders within the system, and this view has led it to focus on teacher needs. Our evidence demonstrates that iHub’s goals, its active pursuit of teacher feedback, the changes implemented within the program, and the program’s metrics for success all point to this focus on teacher needs. Insofar as iHub focuses on teachers’ needs, it cannot also prioritize the other stakeholders in the education ecosystem. In fact, our evidence shows that this has led to weak relationships with other stakeholders, including entrepreneurs, districts, and learners.

5.1 Implications

Our conclusion raises concerns about the ethics of iHub’s facilitation of edtech adoption. In optimizing to meet the needs of teachers, iHub is not optimizing for learner outcomes. In our evidence, less priority is given to students, learning objectives, and teaching pedagogy. iHub trusts that teachers operate with the best interests of their learners in mind, but we cannot be certain that this is always the case. We do not actually know the real implications of the iHub program for learning outcomes, and no research is being done to understand whether students benefit or are harmed.

REFERENCES

Adkins, S. (2015). “Q1-Q3 International Learning Technology Investment Patterns,” Ambient Insight. http://www.ambientinsight.com/Resources/Documents/AmbientInsight_H1_2015_Global_Learning_Technology_Investment_Patterns.pdf

Learning Innovation Hub website. Retrieved December 7, 2015, from http://www.learninginnovationhub.com/

Richards, J. & Stebbins, L. (2015). “Behind the Data: Online Course Market–A PreK-12 U.S. Education Technology Market Report 2015,” Education Technology Industry Network of the Software & Information Industry Association. Washington, DC: Software & Information Industry Association.

APPENDIX A: Investments Chart

Source: Adkins, S. (2015). “Q1-Q3 International Learning Technology Investment Patterns,” Ambient Insights. http://www.ambientinsight.com/Resources/Documents/AmbientInsight_H1_2015_Global_Learning_Technology_Investment_Patterns.pdf

APPENDIX B: Orientation Observation

[Images: raw and amended notes from the orientation observation]

APPENDIX C: Interview Transcripts

Interviewee: Anita Lin, ‎iHub Manager at Silicon Valley Education Foundation (SVEF)

Interview 1: Oct 29, 2015

Audio file: download

Interview 2: Nov 5, 2015

Audio file: download

13:49:00

[Ana] What would you say, um, the goals specifically of innovation is, the innovation group within SVEF is?

[Anita] I think the goal is to find…find innovative things that are happening in education and help support their growth. That’s what I think the innovation side is focused on.

[Ana] Ok. So you pointed to the three different stakeholder groups that are kind of important as you’re going towards your mission, which are students, teachers, and, um, districts [yeah], um. But you said that they’re not always aligned. And so how does, in the work that you’re doing bringing the different stakeholders, and adding even a fourth stakeholder to that, how do you try to align those different groups?

14:53:01

[Anita] Yeah, that’s a good question…We, when I think about it, what I mean is mostly that when you find ed tech companies and you recruit them, some ed tech companies are focused on students and the classroom experience; some ed tech companies are focused on the school experience, or like maybe making life easier for teachers, which I would consider different than a product that…instructs students for math; and then some products, right, are learning management systems, and those are for your district. You want to be able to use them across the district so that all the information is centralized. And so those, so when, so depending on the person who’s looking at a product and their position in that whole spectrum – a student, a teacher, a principal even, an instructional coach, or like someone in the district – the way they look at a product is different. That’s what I meant by that. [mhm] So, does that answer your question=

[Ana] Yeah, yeah, yeah absolutely. And how, could you speak to the challenges of, like, actually aligning [yeah] those groups.

[Anita] Yeah. I think that’s always a tension that happens in education, not just…within our work, but as a whole…sometimes, for example, here’s an example that I ran into last weekend. Someone was telling me, they worked with one of our companies previously. But…what happens in their district is very, I think they’re very on top of the policies basically, and so they approve certain, certain companies for use in the classroom because they meet all the privacy laws and all those…all these requirements they set, so I think privacy laws and more. And so because for some reason the district didn’t approve this one product she had been using before, and so this year she can’t use it. And so, right, to her, she, the way this teacher sees it is like, well, like my students really want to do it, I really want to do it, why can’t I just do this? But then the district sees it as like, you know, we have a process. This didn’t fit our criteria for some reason or the other. And so therefore, we don’t allow it. Right? And so then there’s that tension, and I think we’re still figuring out how you solve that. [yeah] I think it’s, I think it’s a tension for anything, ’cause even curriculum, that happens in curriculum…textbook curriculum adoption. So, [yeah] yeah.

17:17:05

[Ana] So do you see a role for SVEF, in that specific situation, to facilitate [yeah] alignment?

[Anita] So we kind of have been thinking about how do we build capacity of teachers to advocate for products they think are working well. We also have been thinking about how we support districts in understanding ed tech. So if a district, right, we know that some districts are really on top of ed tech in Silicon Valley and some are not. [mhm] And so how do we support districts where maybe they’re not as on top of ed tech, how do we support their administration so that they understand the role of it, understand maybe how select it, and understand how teachers use it so they can provide the support both maybe in resources but also in professional development to their teachers.

[Ana] Can you give a specific example?

[Anita] Yeah, so we haven’t done this yet exactly, which is why maybe I don’t have a great example, but in the spring, we’re thinking about how do we build capacity of the district. And so we are thinking about convening some…instructional tech directors in a meeting and having them kind of talk about challenges they faced or things they’ve done really well in implementing education technology in the classroom. And so some work that supports this (inaud), which I mentioned earlier, is that we do these ed tech assessments where we go to different school districts and…walk them through an ed tech assessment from hardware all the way to software. So do you have enough access points? To do you provide training for your teachers when you do Google Classroom or whatever product they’re using. So we kind of want to use that to support our teachers.

18:55:02

[Ana] Yeah. Do you have a, (three-second pause) I guess, (three-second pause) where would you be, what point in the process are you in this now, if you’re thinking about it for the spring?

[Anita] So we have been, that’s a good question, so we’ve, we’ve done a couple ed tech assessments in the area that we’re kind of targeting right now, so the East Side Alliance area…and we are targeting the last week of July as, like, sorry, not the last week of July, January, as…this director get-together…so I think we’ll kind of get an aggregate report from that data and then run some sort of roundtable with these directors. So that’s kind of what we, we have the idea, we kind of have an idea of when it would be. We are working with Price Waterhouse Coopers, PWC, to implement this work. And so they’re creating a project plan currently. And so then we’ll kind of partner with them to execute on that.

20:03:00

 

26:40:00

 

[Lucas] You’re good?

 

[Anita] umhum

 

[Lucas] Alright so… hum… you guys good… hum… so… I think, hum… we’re gonna dive into a little bit more about the model you mentioned

 

[Anita] Ok

 

[Lucas] So if you could tell me in your own words what’s the process that the startups go through with iHub prior to Pitch night?

 

[Anita] So we’re recruiting startups that are early stage so, what I would say we define that between Seed and Series A, hum… but I think it’s probably more broadly interpreted than that and so… We kind of reach out to contacts we have in the Bay Area and maybe a bit nationally and ask them to help us pass on the message that we are kind of looking for Ed Tech companies that are, that could be used in some classrooms specifically.

 

[Lucas] Ok

 

[Anita] From there companies apply online through a, like, a Google Form. It’s pretty simple. It’s a very short application process, but I do think we’ll probably add to that next year. hum. And then following a certain time period I convene, I invite different people to be part of a short list committee. And so our short list committee consists of venture capitalists, it consists of accelerator partners and then also people from the education community, so that typically is maybe, like, an EdTech coach at a school or an IT Director at a school. Hum. Potentially some educators as well. So we kind of bring together this, this committee that, from all of our applications, narrows it down to 12. Then we ask those 12 to pitch at our Pitch Game and we kind of ask them “Hey, focus on things used in STEM classrooms” and we, we invite judges that are business judges. So typically CEOs of big companies in the area and then also education leaders. So we had [name of person] once. There’s also, hum… we’ve also had people who (whispers) trying to think of who else… (normal) we’ve had educators, we’ve also had IT Directors as well ‘cause we kind of think, you know, they’re different slices of the education world, so we have both of those be there and then they pitch and then the judges ultimately select the pool of companies that we work with for the round.

 

[Lucas] So you mentioned there’s 12 applicants… 12 selected [uhum] and then how many go to hum, the actual orientation?

 

[Anita] We pick between 6 to 8 companies [6 to 8] This last round we picked 6. Hum, I think we have 11 pitches so that’s probably what we’ll have.

 

[Lucas] Uhum… Is there any, hum… reason for this number?

 

[Anita] For us it’s capacity of teachers [capacity of teachers] so we support, in our last round we supported 25 teachers. And this round we have ’bout 13 teams of teachers. And so we kind of didn’t want companies to support more than 2 or 3, although I think… we… we… we didn’t want it to be super challenging for companies to support, and also since they are early stage products, we found that some companies as they’re taking off, like, they get really busy and they’re, like, completely immersed so… I think it’s to balance both the support aspect but as well as kind of the teachers that we can support also.

 

[Lucas] Uhum… so let’s go a little bit back, uh, what happens between the pitch night and orientation in terms of your interactions?

 

[Anita] So we send out to our teachers and they’ve kind of, I would say vaguely, defined the problem or need, and we’d like to kind of, like, focus them in the future. Make ’em define it a lot more clearly. Hum… but have… we send out… I send out a form that kind of says, you know, “You’ve seen these companies at the pitch round. Here’s some more information about them if you’d like”. And I ask them to preference these different companies. So like, 1, 2, 3, 4. I mean, we have them preference them whether or not they’re going to work with all of them. And so, we… then they preference them and I kind of match them. Typically, if I can, I just give them their first choice of company that they’d like to work with ’cause I think that (mumbles) builds a lot of investment in our process, hum… and then by orientation they know who they are working with and then we kind of tell them that all of… We’ve already told them all the program requirements before but we kind of go over them at orientation and then go over… let them meet their companies for the first time.

 

[Lucas] Great. And how about after orientation? What happens?

 

[Anita] So after the orientation we kind of let them go and set up their products for about a week or two, depending on the time crunch we have from the end of the year, and then… for the next couple of weeks they use the products in the classroom. There might be some observations but I would say these observations are mostly from a project management perspective more than, like, an evaluative one. And then they submit feedback. And so we have some templates that we give them that we ask them to submit feedback through. There’s probably a guiding question for each one and each week we’ll update that guiding question. Also we use a platform called LearnTrials which kind of gets qualitative feedback generally from these teachers about the product and includes comments but also has a rubric that they kind of use. And we’ve asked for pre and post assessments in the past that our teachers created, ahm, but this probably hasn’t been… we have not been doing that. I think we need to find a better way to incorporate it, so…

 

[Lucas] So, so… tell me a little bit more about this tool for the Qualitative Assessment. You said the name was?

 

[Anita] LearnTrials – and so they have a rubric that assesses an ed tech company across different strands, whether that’s usage, whether that’s how easy it was to set up. And they kind of just rate them, almost like grading it, you know, like give it an A, give it a B. So like, kind of, like, over time. And we ask them to submit it in different, like different… on a routine, so every 2 weeks or something. Where you’re able to kind of see how the product performs over time.

 

[Lucas] And am I correct to assume that after orientation the process goes towards, until the end of the semester?

 

[Anita] Yes – so it’s only until the end of this semester. So typically December, I want to say, like, the 18th

 

[Lucas] And then what happens?

 

[Anita] And then at the end of this orientation we, SVEF, maybe with the help of some of our partners like LearnTrials, will aggregate some of this data and will share that out with the community. Additionally for this round something we’d like to do is maybe then, from our 6 companies that we work with, work with a few of them and help them… help support implementation in the school versus just a couple classrooms at a school. So we’re still figuring out this spring what that exactly looks like in terms of the implementation of the, these products, but that’s something that we’d like to do.

 

[Lucas] And when you say community you mean teachers, schools, and the EdTech companies? You share it with everyone?

 

[Anita] Yeah

 

[Lucas] Hum… so what other events or resources do you provide that have, like, similar goals or priorities? Or is this the only…

 

[Anita] Like within our organization? [yes] Well, in terms of teacher support, like, our Elevate Program, I know… I talked about how it helps students, but really a big point, I think a big selling point for districts, is that it helps teachers; we give a lot of teacher professional development during that time. And so I think our program is also to help teachers who are early adopters of technology, help them kind of meet other teachers at different schools who are early adopters, and build a cohort that understands that and kind of can refer to each other. Humm… so we also do some, some, I would say, professional development; it’s not as extensive as all of that is, but we kind of want to help teachers understand how to use it, EdTech, in their classroom. Potentially, referencing… Our goal is to reference the SAMR model. So…

 

[Lucas] Uhum… And is this whole process the first cycle you guys are going through, or you have been doing this for a while?

 

[Anita] So we started our first round in the Spring of 2014 – so this is technically round 4 but we’ve iter… like… it changes… little pieces of it change each round. So in the past when we’ve done it, when I ran it, it was just, I would recruit individual teachers from schools and so then I would form them into a team, so maybe a teacher from school A, a teacher from school B, and a teacher from school C. And in this round I re…, we did recruitment where I recruited teacher teams. So now it’s like 3 teachers from school A, 3 teachers from school B, and then they are all using the same product at their school site, so I think that helps with the piece of collaboration that was harder to come by earlier.

 

[Lucas] And how was it harder?

 

[Anita] Yeah, so I think for our teachers we would like them to meet up kind of weekly. And when you’re not at the same school it’s a lot harder to meet on a weekly basis because maybe one night one school has their staff meeting and then the other night the other school has their staff meeting and then, you know, I think it was a lot of commitment to ask and I think a lot of teams found it really challenging and maybe would not always be there because of that. Hum… So… that was a big shift from that. But I think it really builds a community within their school. And I think, what we have heard from teachers and from districts, is that a lot of times for a school to adopt or, you know, use a product across their school, it’s because a group of teachers have started off saying “I’ve been using this product. I really like this product. Hey, like, friend over there! Please use this product with me,” and they are like, “Oh! Yeah, we like it” and it kind of builds momentum that way [uhum]

 

[Lucas] So yes, so I guess that talks to the implementation phase of the, of the software that they were trialling. Hum… could you tell of us of a, a specific hum, aaaaa, ww… what do you call this phase after orientation? the pilot? [the pilot] phase. The Pilot Phase. So. Yeah. Could you tell us one story that things went really well or things went really badly?

 

[Anita] Sure! So, there was a product that was used in the last round where I felt like, it was really… we saw a lot of interesting things happen, hum… but they’re all, like, a lot of qualitative metrics. So it’s called BrainQuake, and actually the CEO and cofounder, he’s the… he actually was an LDT graduate, hum… but… it’s basically this game on an iPad or… whatever, where you can play… you have little keys and you have to line up the keys to get this little creature out of a jail essentially, but, what was interesting is when you turn the gears it also kind of teaches, ah, number sense, so it’s like, this interesting puzzle that kids kind of enjoy solving. And so he was using this in some classrooms in the Bay Area. Also one in Gilroy, and this teacher was a special ed teacher. And so she was kind of showing them this and so… What I think was really, really successful, that I found, was that for one of her students, they had trouble with, like, motor skills. And so one of the skills that they had trouble with was kind of like turning the gear on the iPad. Hum. But the student actually learned to turn the gear, like, to the left. ’Cause you can turn it both ways and they were able to, like, learn that skill, doing a different motor skill than they had before, because they really wanted to play the game. And so I thought that was, like, a really wonderful example of how technology can be really inspirational or, like, really helpful versus, I think, the other, you know. Well, lots of examples in literature where technology just, like, you know, it’s just a replacement for something. 
Hum… and so… I think also a lot of the other teachers who worked with that product really, their students really enjoyed it, ‘cause it is really engaging, and they were making, like, connections between the fact that, you know, I’m doing math, and they could see h… they could understand that, you know, if I redid this into an algebraic expression… like they were coming up with that terminology and then they were like “we could just rewrite this into an algebraic expression”. And I think that was, like, a really wonderful example of a product that went really well.

 

[Lucas] Did that product end up being hum, adopted or implemented in the school [Yeah, so…] effectively?

 

[Anita] That just happened this spring and I don’t think it has been yet. Hum… they’re still also like an early, you know like an early stage company so they’re, I think they’re still growing and figuring out exactly what it looks like. But I think that we are trying to support companies in that way. And we’re still figuring that out. So…

 

[Lucas] And was there ever hum… a big problem in a pilot?

 

[Anita] Yeah, let me think… typically I would say the problems that we run into in a pilot is where companies are, like, working with their teachers and it’s going well, but then sometimes companies get really… I guess it depends, now that I think about it. In the fall of last year there was one where the company, like, the developers got really busy ’cause their, start-up just took off. And so they became pretty unresponsive with our teachers. The teachers were, like, emailing me, and I’m, like, trying to get in contact with them, and so typically when there’s not communication between these parties, it would… the pilot would not be as successful as it could be because they weren’t communicating. Things weren’t changing. Hum… In the spring, one of the things… The biggest challenge we found was actually testing. So testing was happening for the first time for Common Core, and so what would happen was these teachers would email me, being like “I can’t get a hold of the Chromebook carts”. Like, they just couldn’t get access to the technology they needed to run their pilot. And so… one teacher… her district told her this before she, like, committed to the pilot. And she just, like, pulled out. Like she’s like “I just can’t do this”, like, “I don’t have access to these, to like, the technology that I need”. Hum… But some other teachers, they were, like, one of them told me she had to go to the principal and, like, beg to use the Chromebooks on a, like… on a day that they weren’t being used, but I think because it was the first year of testing, a lot of schools and a lot of districts were very, hum, protective of their technology ’cause they just wanted to make sure it went smoothly. And that totally makes sense. 
And so… for… because the… the testing, when it… kind of… varied, like, when this would happen for the different schools, but some schools were more extreme in, like, saying, you know, “we’re just gonna use it for all of this quarter”… like we… like, you know, “we’re gonna lock it up”, and then others were like “Well… we’re not testing now so feel free to use it”. So… That was a big challenge in our pilot this spring.

 

41:55

 

Interview 2

 

[James] So, do you, do you se-, do you see other people sort of coming in and filling that spot? Um bes- I mean, iHub, right?

[Anita] Yeah?

[James] Um, has anyone else tried to do that or…?

[Anita] Yeah, actually, that’s a good question. That makes me think of something else. Uh, the county sometimes does it. 

[James] Mhm.

[Anita] So especially in California where there’s small districts, uh, there’s a district in Santa Clara County that has like two schools.

[James] Mhm?

[Anita] There’s a couple districts in San Mateo County that have three schools, and so this is like, you know two elementary schools, one middle school? They, the county, can be assisting in kind of helping develop tech plans. Maybe not per se rolling out of… it may not be rolling out of specific technology but they kind of help support infrastructure. They may also help with, let’s say if three districts in a county want to purchase a certain product, and they’re really small districts, so like, so their total makes like eight schools?

[James] (laughs)

[Anita] Right? The county can kind of help facilitate a purchasing plan with the other schools, so that way prices are more fair for the, that, those schools.

[James] Yeah.

[Anita] So I guess the county does sometimes play a role, but it depends on the county. It depends on the county leadership too.

[James] Mhm, and have you noticed um, how well they do?

[Anita] So, I think San Mateo is one of the counties that does well in this. Uh, I know that they’ve had some, they’ve definitely assessed their schools in San Mateo County two years ago for tech and how in-, the infrastructure is. I actually have a website that I can share with you about that. But it tells you like the ratio of how many like IT personnel there is to students, like but it doesn’t say anything about software. I think it’s simply in tech- tech adoption, it’s simply like the infrastructural side, not like software.

[James] Yeah.

[Anita] But that’s important, right?

[James] Yeah.

[Anita] You can’t have that without, you can’t have software without hardware, so, you know.

[James] Yeah, no, that is very true. Um, and so in, in cases like, like San Mateo, um what do you think iHub like sort of adds to the mix then?

[Anita] Yeah, so, I think because San Mateo County isn’t per se te-testing software,

[James] Mhm.

[Anita] Our goal is really to help support software implementation. 

[James] Mm.

[Anita] And seeing, you know, what works in software, what works in edtech that way, um, I think they’ve done a lot of the, the other research. And also, I think it’s changed a lot. Like two years in the tech, edtech world is a long time.

[James] Yeah.

[Anita] Like two years ago, it was, it, the landscape looked different, like Khan Academy looked different. Some of these startups didn’t exist, right? So, or maybe they did and they folded.

[James] (laughs)

[Anita] Like there was something, Amplify, I think?

[James] Oh,

[Anita] So, =

[James] Joel Klein.

[Anita] = So, so I think there’s a lot of, like, change in the world? So, I think that’s a big, I think you have to re-, continually assess in order to, in order to have, like, a better read. So I think iHub helps in, it can help in supporting the county. We’re trying to create, like, a systematic way to, like, do that, I guess, which is assess kind of the edtech side infrastructure but also create a model so piloting of edtech, especially new edtech, is easier, and then there’s a route that’s more clear-

[James] Mm.

[Anita] -to the question for what works and what doesn’t.

[James] So, sort of, um, so would you say, so just to repeat, sort of like to rehash-

[Anita] Mhm?

[James] -what you’re saying, it’s, you’re sort of setting like this, um, like the front runner, right? You’re setting like this sort of example?

[Anita] That’s the goal. I think it is to have some sort of model that you can follow, like implement, like a flow chart almost. 

[James] Mhm.

[Anita] But we know that school districts are different, so there probably will be some choices or wiggle room in some of these decisions. But I think that’s the goal.

[James] Mhm. And, um, and you may have already touched on this. So what do you think is the, (pause) what DO you think is going to be like the iHub sort of like place in the world?

[Anita] I think the research part is really important. So I think school districts can’t always fund a lot of the research, and I think we, we now have a process for matching and school support. But I think the research cycle really brings it all together, so if we are able to create a strong research process-

[James] Mhm?

[Anita]  -uh, then it will be able to, schools will be able to kind of use the research process and say like, “This works, we should use this. This doesn’t work. This is why.” Give them feedback, hopefully they’ll change, the companies will change.

[James] Mhm.

[Anita] And then, move forward.

[James] Yeah. So how, how do you see that sort of unfolding?

[Anita] How do what?

[James] (in a clearer voice) How do you see that unfolding?

[Anita] Yeah, that’s a great question.

[James] (laughs)

[Anita] That research side is always the side that EVERYONE in this field struggles with. Um.

[James] Mhm?

[Anita] I think right now, there’s more and more literature on it, so we kind of start from there. We also work with different partners, so we’re kind of thinking about, uh, I know another group is doing design-based implementation research, so DBIR research. Uh, but it’s kind of the goal that everyone in the group—so the teachers, the adaptive helpers, the students—everyone plays a role in kind of designing, kind of giving feedback. It’s like implemented in the classroom, but they kind of altogether give feedback so that, over time, the product gets better.

[James] Mhm.

[Anita] Um, but in the world of research, (3) I think we’re kind of, WE are kind of on the exploratory research side slash design slash implementation side, so we’re like earlier. And so I think, we’re, we’re still learning a lot about the field-

[James] (laughs)

[Anita] -about what that looks like. So we have things in place, but I think we’re trying to make them more robust.

[James] Mhm. (in a softer voice) Very cool. (in a louder voice) Um, and so, we went rogue for a little while there. Uh, so, let me backtrack a little bit. What do you think, what do you think is sort of the ideal relationship between um, edtech company and school or educator?

[Anita] Hm, that’s an interesting question. I mean in my head an edtech vendor is a provider, right? So they should be providing some service that fits a need that a school has or a teacher has or a student has in some way.

[James] Mhm.

[Anita] So that’s how I imagine the relationship is, is that they’re providing something to the student. But at the same time, obviously, that providing something, it, it’s a ben-, there’s some benefit to the student, or teacher, or classroom that it brings (…) efficiency? Right? It could be classroom efficiency. It could also be like differentiating or like being able to adapt to each person where they’re at in the classroom. Um, but I think there has to be some sort of benefit to it. 

[James] Mhm. 

[Anita] Yeah.

[James] So that’s from the um, that’s from the edtech company to the educator.

[Anita] Yeah.

[James] And what about vice versa? 

[Anita] I think in my head, it brings I think it helps, I think it just, I think they’re able to give feedback? I don’t know, I never thought about that, it, as much that way. But I imagine that if a product is doing well, then it also provides, like over time, it’ll provide feedback, and that product will continue to get better, and it will continue also to grow in usage around the area.

[James] Mhm.

[Anita] Where, not necessarily around this regional area, but in the area that it’s being used.

[James] Mhm. Um, and would you say sort of, I mean, so iHub, your, your core mission, right, um is to, your value proposition was to, you know, sort of facilitate this interaction=

[Anita] Mhm?

[James] = Uh, do you think you, how, how would you want to like, I guess, how would you want to facilitate that ideal relationship, um?

[Anita] Yeah, so I think there are ways that we’re still working on to figure out exactly what that looks like, especially thinking like five years in the future? 

[James] Uh huh?

[Anita] But for now, I think no one kind of facilitates these relationships so we take the place to do that.

[James] Mhm.

[Anita] Uh I think in like ten years, ideally, we wouldn’t have to do that because schools and districts would be doing that internally, right. 

[James] Yeah.

[Anita] They would be able to set aside part of their budget to pilot products, not to pay the products, but maybe to pay the teachers and, or, maybe they don’t even, like, it’s part of the integral process of how you’re teaching so it’s related to the professional learning that happens.

[James] Mhm.

[Anita] In school. Um, and then they would use data collected from these pilots as decision points for whether or not to purchase the product, and then if they don’t purchase the product, or even if they do, kind of give that feedback to companies so that companies can change their product to be more appropriate for the education world. 

[James] Mm. Can you elaborate a little bit more on that, actually? It’s uh…

[Anita] Yeah, so I think we just want to make sure the, the products are really relevant to students, right. 

[James] Mm.

[Anita] And so, that’s the way we do it, is that you get feedback from students and teachers, but I think those needs change, right? Each year, this year, the needs are different than last year, because this year you’re using Common Core, and last year, maybe, it wasn’t as big of a, actually it was really big last year.

[James] (laughs)

[Anita] Maybe two years ago wasn’t the same, right?

[James] Mhm?

[Anita] So I think that that’s a big…

[James] Big?

[Anita] Difference. Yeah. 

[James] Yeah.

[Anita] So.

[James] Um, and is there other things that you, is there um, anything you would either do, well, what would you want to keep the same or would you want to do differently, or would you want to sort of sustain? Do more of?

[Anita] Yeah, I think there’s a lot of things that are good right now for matching process. I think that it’s really helpful that we have lots of connections to districts, so I think that we need to continue to maintain those relationships, but also continue to grow them.

[James] Mhm?

[Anita] Uh, I think starting with the problem of practice. So having teachers kind of come in with the need they want a product to use to fill-

[James] Mhm.

[Anita] -um, is important. But I think maybe something to change on that front is also how you help them define that problem of practice, because I think some teachers come in and say like, “We really want differentiation lists in math in third grade.” But then when they finally see the companies that are selected, they’re like, “Oh wait, we really want to do something else.” And it’s like, was that really a need of yours? Or were you just kind of saying that because it sounds like a need that everyone’s talking about?

[James] Yeah.

[Anita] Um, and so I think helping teachers really focus on a problem of practice, that’s something that we’re learning to work on, but (clears throat) this year at least, it was at least stated, versus in the past, it wasn’t even stated at all.

[James] (laughs)

[Anita] So continuing to going, going down that path is really important-

[James] Mhm.

[Anita] -in the matching process slash vetting process. Um, I think something that has been good especially in the Bay Area is working with early stage companies =

[James] Mm.

[Anita] =and so we work with early stage companies to, you know, it, it’s a good place to be for that. So I think for us, that’s a really good niche.

[James] Mhm.

[Anita] Um, but I do think as time goes on, something that needs to kind of change in the work is that we have to support both early stage companies but also like mid, like later stage companies, so that you know, teachers change their practice or you know, like, it, is it really affecting students if it’s in ten classrooms, right? Not really.

[James] (laughs)

[Anita] So like, I mean, it does, but, you know. It could have a wider, wider effect if there are more, if there are more, it was in more, if it’d shown that it actually should be in more. 

[James] Mhm.

[Anita] So. Other things, I think research similarly, like, we have some protocol, some usability research, but I think it would be nice to step into a little, especially for later stage companies, how do you help with maybe, more specifically, efficacy research? Which is how well, or how well this product’s meeting a need that it said it’s meeting, that it said it’s trying to fix.

[James] Mhm.

[Anita] So, (in a very soft voice) that’s one, (in a slightly louder voice) one thing I guess. 

[James] Yeah.

[Anita] Mhm.

[James] So that, this is actually quite interesting the um, I guess for me, the, the idea of you know, early versus late stage, right?

[Anita] Mhm?

[James] Um you, you brought up that that’s sort of, you see that as your, as your niche, right? Is the early stage. 

[Anita] Yeah.

[James] Um and I can, I can guess as to why, but can you, can you tell me a little bit more?

[Anita] I think one of the big challenges is, in edtech, it’s like there’s so many edtech companies so it’s how do you kind of bring to the surface the ones that are promising? So, I think our goal in vetting the companies is to bring to the surface some of the more promising early, like, edtech companies and kind of help them go from early to mid. I think there’s a big jump from those two and some people don’t (laughs) don’t make it. 

[James] (laughs)

[Anita] Actually a lot of people don’t make it. =

[James] (while laughing) A lot of people don’t make it. 

[Anita] =Yeah, a lot of people don’t make it. 

[James] (laughs)

[Anita] Most. So.

[James] (while laughing) Yeah. 

[Anita] I think that’s the goal.

[James] Yeah. 

[Anita] Yeah.

[James] And, and so when you, when you call your, your niche, that sort of implies like a competitive advantage, right? Um for iHub=

[Anita] Mhm.

[James] =specifically? Um and so, um yeah, I mean, yeah. Could you elaborate more about this, that, that idea?

[Anita] Yeah, and I think, well I think that’s mostly because right now, schools DO need edtech products. Like they des-, they want it. They want products that do x, y, z. Um but they don’t really know how to go about and find them. So I think that that’s why we’re working with early stage companies because I think it’s, it’s possible to find one now that meets the needs of many teachers and kind of help it kind of just move along.

[James] Yeah.

[Anita] And it’s quote unquote adoption learning. Which, I use that word. I don’t really love=

[James] (laughs)

[Anita] =the word “adoption.” I think it has a lot of loaded meanings, but.

[James] Um (laughs).

[Anita] Yeah. 

[James] And, and wh- why, why do you think iHub is uniquely sort of in the position to do, you know, to like really understand that?

[Anita] Mhm? I think it helps that we have a lot of partnerships we’ve previously formed. I also think that since we’re neutral, we’re not a school, we’re not an edtech company. I think that that puts us in a position to facilitate those relationships well. 

[James] Mhm.

[Anita] Uh I think if we were a school, then we’d be constantly thinking about like, “how much does it cost?” Like, uh other things, that I feel like schools HAVE to think about. 

[James] Yeah.

[Anita] Which I mean, are very important. We also keep those in our head when we’re recruiting, but I think it also gives us some neutrality, I think, as well, so. And with, the other side is that we’re not really affiliated with edtech venture funds, or like incubators, right. We have partnerships with them, but we’re not like soliciting. Or we’re not trying to make a sale, so school districts are more willing to work with us because we’re not like, “You have to use this product because we’re going to like make money from the fact that you use this product.” =

[James] Mhm.

[Anita] =It’s just like, “Oh, from the tests that WE did, and the research that we, research that we’ve done with other teachers, they really enjoy this product specifically for these things.” So, yeah.

[James] Um, and do you think, do you think, or how hard or easy or whatever do you think would be for, not a competitor, but like another sort of um, iHub model to come up and sort of, you know, also add to, add to the ecosystem?

[Anita] Yeah, so, the compa-, the other groups that we work with kind of do similar tests they run. We call them test beds. They have similar test beds. But the three that have been funded so far, we focus on early stage, I would say, iZone kind of focuses on design implementation research, and then Leap focuses on impact or efficacy research-

[James] Mhm.

[Anita] -so we kind of do have similar people in the space, but not here in the Bay Area.

[James] Yeah.

[Anita] I do think there are more and more coming up. I think (3) it would, I mean, it’s good to have more people doing research about this because no one knows how to do it well. 

[James] Mhm.

[Anita] So, I think that would be a good, it would be good in some ways, obviously. And then, obviously, for, in other ways it would be more competitive for us.  

 

36:54:04

[Ana] We’ve been talking a lot about the opportunities for educators to give feedback back [yeah] to the startups to improve their products. Are there any opportunities for the entrepreneurs and the educators that are involved in this partnership to give feedback back to SVEF?

[Anita] Oh yeah. There are. I never mentioned those, but there are lots. Every time we have meetings, we are very…open about that. And I also think, we have surveys, so there’s a lot of, we send out a lot of surveys about a lot of…different, specific, different…happenings, and so after…orientation happened, there was a survey that was sent out about that. After the Pitch Games happened, there was a survey about that…I also think that during the rounds if we have strong relationships with teachers, which is typically the case, then teachers are very open with us. At the end, I’ll be like, you know, we’d love to hear your feedback, and they’ll just tell us, you know, we’d love it if there was this, this, this, this. And that’s been helpful, and we’ve made a lot of those changes based on teacher feedback. Like for example, the reason why teacher teams are at a school site this year instead of from all different schools is partly ’cause it makes sense, I think, to scale, but also because we that was one of the big pieces of feedback that was given from the beginning. So.

38:15:00

[Ana] Great, that’s awesome. Um, another question that comes to mind is, you mentioned that, uh, you’re working with earlier stage companies as opposed to [yeah] later stage companies or startups. Um, when you think about how that impacts, or how that affects the actual impact of a product in a classroom (two-second pause), what comes to mind?

[Anita] I don’t know if I understand=

[Ana] Yeah, let me totally rephrase that. (observer sneezes) Bless you. Do you think that working with earlier stage startups, as opposed to later stage startups, impacts the, or affects the impact that a company can have on learning in a classroom?

39:10:08

[Anita] Uh. Could. I mean, it depends right ’cause if (pauses for three seconds) yeah and no, I mean, it depends…if you’re thinking about it like with time or if you’re thinking about it like short term or long term I guess. So one of the organizations that we work, a different (inaud) does this work with mature companies. And so what they do is they work with schools who have already kind of been using specific products in the classroom, and then they do very specific research using…data points and observation and kind of tells these schools…yes, this product works or…no, we don’t really think this product works…And I think that’s important for schools to know whether or not they’re paying for something that doesn’t bring any learner outcomes, right…or isn’t helping their teachers…adjust to 21st century learning or you know, just changing the way they teach…I think for us the goal is that, you know, we kind of help with this, this market where it’s a little, it’s not very defined…no one is really guiding these people. So they just come up with an idea, and they just kind of throw it out. And if it works, that’s great, and if it doesn’t, then not. But I think we’re kind of hoping to pull out some of those that work. But I think ours, the goal would be like it’s a longer term. You would find out over long term if it works versus something that’s more like yes that works or no that doesn’t work right away. So, yeah.

[Ana] Do you think there are any negative repercussions of trying out products that are so early stage on real learners?

[Anita] Yeah, good question. I think that it’s definitely a possibility. I would say that, I would say that I think teachers who we pick, we try to pick ones who are very…very experienced with using tech in the classroom and so I think that you, you find that teachers who use tech in the classroom, you…it’s like their instruction is different and so you know, depending on, I think they can make learning happen with almost, with different, different pieces. And so I think that’s one way we kind of counter from it. But it is true. But I think it’s like would you rather have teachers do that without any oversight as to whether or not that works? Or would you rather have them do it with some facilitation as to whether or not it actually, there’s some…you know, conclusion at the end like yes, it works or yes, it doesn’t. I think teachers sometimes already do that in the classroom. So. Yeah.

41:45:01

[Ana] Interesting. (eleven-second pause) I think that it’s interesting that you said that ideally in ten years, an organization like SVEF would be out of business=

[Anita] Well, I would say that the iHub Program. (both nod) Yeah. We do a lot of other shit. But=

[Ana] Yeah.

[Anita] I don’t know if that’s…an organizational goal. I think that would make the most sense, right, ’cause I think, in my head, if you…if you identified a problem and you’re able to solve it…that’s great. (laughs)

[Ana] Yeah. Do you, do you think that’s actually possible?=

[Anita] Going to happen? I don’t know. I think it’s hard to say because I would say I don’t know enough about school districts and about school…counties, offices, to be able to know whether or not they’re functional. There’s a lot of bureaucracy, I think, that comes up when you work with the county and work with…there’s so many different needs and so many different people kind of working on it that sometimes…they can’t, they’re unable to kind of do certain actions because of different reasons, whatever they are. So I don’t know. But that would be like, in an ideal world to be able to give a model to a school district, to any school district and be like if you kind of adopt this to your school…this is a way you could pick technology for your classroom, classrooms, and also give feedback to these developers, and then developers would also have a clear path to entry, which I think is a big issue in the market.

 

Tech 4 Learners – Final – Reflection on Design Project

1. Product/prototype

Our target learner was Achu, a calm and smiley 12-year-old boy who is fond of playing basketball, watching HotWheels videos on YouTube, and painting. He follows instructions well yet rarely initiates activities on his own. The same is true of communicating with others, unless he needs to go to the bathroom or needs more paint, for example. He responds to questions but is not always sure about his answers. He often repeats the last words he heard when answering. Our impression was that he knows the answer yet has trouble externalizing it appropriately.

We immediately focused on the idea of helping Achu initiate verbal communication, insofar as it would help him express his desires and needs more effectively. Our initial brainstorms revolved around technology such as VR and games that would prompt him for verbal responses or would require verbal input to be used. We generated a few statements that helped us focus on the learner’s needs and the solution:

How Might We

  • HMW help him say more words?
  • HMW motivate him to want to communicate?
  • HMW stimulate him to produce original words?
  • HMW make him comfortable sharing words with others?
  • HMW make him feel like his words have value?

This led us to the following Needs Statement:

“Achu is a shy pleaser who needs to practice creating his own words in order to facilitate him communicating with others.”

To achieve that, we created a low-resolution prototype that consisted of playing a video with no sound on a laptop and prompting him to narrate what was going on. The final goal was to have a video with his voice narrating the events. We were able to engage him in the activity, and on a few occasions he actually generated new words when prompted. We felt that the prototype achieved some of the initial goals, but something was still missing for it to be considered truly effective.

After this initial test, we received feedback from Marina. She thought the prototype worked, but partially because narration is a technique that had already been used extensively by his speech therapists. He generated new words but still needed prompting from us. She also encouraged us to think more about how he could transfer what he learned within our product to his everyday life. With this in mind, we evolved our learning goal to:

“We want Achu to learn the value of communicating with others.”

After presenting our findings from the initial prototype, we dug deeper into what was missing and discussed more potential solutions. We finally connected to the idea that the value of communication shows itself most evidently when helping others. We could use a teachable agent in the product and elicit the Protégé Effect (Chase, Chin, Oppezzo, & Schwartz, 2009). We built on the idea that while Achu might not always find it natural to speak for himself, he might find it compelling to speak to someone in order to help them.

We introduced Tom into our prototype: a blind cat who asks Achu for help figuring out what is on the screen. We scaffolded the experience by creating a simple learning progression. We start with a single word on the screen. Tom asks Achu what the word is. Once Achu says the word, Tom thanks him for his help. We wanted to ensure that we were ‘valuing the process and not only the final result’ (Dweck, 2007). After 3 words, we moved on to 3 short sentences, then 3 pictures, and finally 3 videos.

On the day of the test we were unsure about the results, and therefore also brought a few other activities to gauge Achu’s engagement and the levels of communication we could elicit from him.

We had him play with an app that records what you say and plays it back in a funny voice through a character. He soon grew bored with the activity.

We moved on to observing him assemble a jigsaw puzzle whose completed set formed a phrase instead of a picture. He was very fast at combining the scattered words into a perfect sentence. He was also prompted to read it out loud, which he did with ease, with the exception of one word he did not know how to pronounce. He seemed embarrassed but was reassured by the teacher that it was ok to say that he did not know – which he finally did. This episode showed us that the Protégé Effect might actually work for him, since he would not want his ‘friend’ to not know something.

Our final activity prior to testing our prototype was to engage him with text messaging. He clearly understood what was going on and responded by typing on Alex’s phone while I was in another room with the other phone. His trouble was dealing with the small keyboard on the phone, but the activity showed promise: he might engage well with this form of communication given a larger keyboard.

Finally, we tested our prototype. Achu was immediately fond of Tom the cat and rapidly replied to his prompts. The words, sentences, and pictures he verbalized promptly. The video also succeeded in promoting verbalization, though it took him some more time to think about what to say. Once he did, and Tom thanked him, his energetic and positive reaction was priceless and strong evidence that the Protégé Effect worked. He even clapped his hands and said “Achu is helping the cat!”.

What surprised me the most during the process was how a small adjustment in the product resulted in such a big change in the level of engagement. The process of narration was still the same, yet the purpose and motivation behind it were made clear to him. Narration for narration’s sake did not have value for him. Helping Tom did. It also reminded me that we were eliciting, in a small way, Joint Media Engagement (Takeuchi & Stevens, 2011) between Achu and Tom the cat. They were both consuming media and helping each other out – feeding off of each other – learning from each other. A lesson learned that I will carry into all my future design processes.

One thing I felt was missing in the process was a greater level of engagement with Achu’s teachers, Marina, and eventually his parents. At fault were a lack of time, schedule conflicts, and too few attempts on our part to communicate more frequently with the stakeholders. Yet for the purposes of the course and the learning process, the interactions were fruitful and thought-provoking, always leading to new iterations and fine-tuning of the product.

2. Collaboration 

The collaboration within our team was effective. I assumed the creative and technical role, while Soren looked at our product through a more pedagogical lens and Alex handled the documentation and write-ups. It was a fruitful process in which I felt everyone in the group contributed effectively and pulled their own weight throughout. My multimedia skills helped us rapidly create the prototypes, presentations, and video. Soren’s teaching background helped us select the appropriate language, level of complexity, and scaffolds toward learning. Alex helped us summarize and document our meetings, tests, and findings.

Our process was very much guided by our class activities. We met only twice outside of class, not counting our three visits to OMS. This does not mean that we did not communicate outside of class: through Google Docs we constantly collaborated on the presentations, texts, and ideas. This demonstrated both the effectiveness of the scaffolds we received as designers from our professor and our group’s efficiency in generating ideas and agreeing on the path to take.

Next time around I will certainly work again with all the collaborative digital tools we used to document and brainstorm our ideas. I will also take the lead in creating the multimedia content, since it is something I enjoy doing and now see how valuable it is. As for doing things differently, I only wish we had had more time to interact with the stakeholders and the learner. I will push harder to communicate more effectively with the intended audience and try to gain more insight into the learner’s needs.

3. Learning

My learning experience during the project was more one of trying to apply learning theories to the project than of trying to be overly creative, as was the case in some previous projects I’ve worked on. The challenge was to design for a learner about whom we knew very little, but using an educational lens we were able to apply and test learning theories with some success. A core motivator for me was actually a bit of the Protégé Effect mixed with the Four-Phase Model of Interest Development (Hidi & Renninger, 2006). Looking back at the quarter, I noticed that much of my effort toward creating a better product was drawn from wanting to please the other, to teach, and to provide a benefit to his life. Not to mention the desire to please the teacher in the process. As for the interest development aspect of learning, I feel I reached Emerging Individual Interest, close to Well Developed Individual Interest – depending on whether I am able to evolve the product in the future.

More importantly, I believe I improved my skills and techniques in rapid prototyping. The pressure of creating a functional prototype to be placed in the unguided hands of a user was removed by the “Wizard of Oz” technique. It allowed me to create more freely and rapidly, and to keep thinking freely about potential solutions instead of being invested in a product because of all the time spent detailing a quasi-product. Yet I also learned that designing this way requires some previous experience with prototyping: you must be able to predict user interactions that might completely break the desired effect. Therefore, even a free-form rapid prototype has a Minimum Viable Product.

References

Chase, C. C., Chin, D. B., Oppezzo, M. A., & Schwartz, D. L. (2009). Teachable agents and the protégé effect: Increasing the effort towards learning. Journal of Science Education and Technology, 18(4), 334-352.

Dweck, C. S. (2007). The perils and promises of praise. Kaleidoscope, Contemporary and Classic Readings in Education, 12.

Hidi, S., & Renninger, K. A. (2006). The four-phase model of interest development. Educational Psychologist, 41(2), 111-127.

Takeuchi, L., & Stevens, R. (2011). The new coviewing: Designing for learning through joint media engagement. New York, NY: The Joan Ganz Cooney Center at Sesame Workshop.

Tech 4 Learners – Final – Notes

National Education Technology Plan

  • Focus on technology but need to use it for PD

Understanding by Design

  • Backwards design or backwards planning
  • Clear learning objectives
  • How could we incorporate game design practices into education?

Computer Criticism vs. Technocentric Thinking

  • Ed Tech is not the silver bullet – must come with pedagogy and PD

In-Game, In-Room, In-World

  • Kids learn plenty from each other
  • Kidification of education

The Perils and Promises of Praise

  • Growth mindset
  • Constructive praise – praise effort and process, not ability itself (“you’re so smart!”)

Four-Phase Model of Interest Development

  • Model
    • Triggered Situational Interest
    • Maintained Situational Interest
    • Emerging Individual Interest
    • Well Developed Individual Interest
  • Teacher’s interest is probably the best predictor of effective teaching
  • Teacher’s role is to provide:
    • Positive feelings
    • Generate curiosity
    • Provide opportunities
    • Guide on research

The New Coviewing

  • Joint Media Engagement
  • Design Guide
    • Mutual engagement
    • Dialogic inquiry
    • Co-creation
    • Boundary crossing
    • Intention to develop
    • Focus on content, not control
  • Challenges
    • Parents too busy
    • Parents unaware of needs
    • Don’t enjoy the same content
    • Desired interactions not always triggered
    • Little continuity into other family activities
    • Distractions are always present
  • Design principles
    • Kid driven
    • Multiple planes of engagement
    • Differentiation of roles
    • Scaffolds to scaffold
    • Transmedia storytelling
    • Co-creation
    • Fit
  • “What goes on between people around media can be as important as what is designed into the media”

Teachable Agents and the Protégé Effect

  • Care more about pleasing others than oneself, so having someone you need to help enhances learning through teaching this person

Tangible Bits: Beyond Pixels

  • Tangible User Interfaces

Horizon Reports

  • re-teaching our teachers how and what to teach