Driving the process of innovation
When you're in a situation where there's just a ton of certainty, it's actually pretty easy to do strategic thinking.
You look at the variables, you make a plan, and you implement it.
It's like if you want to cook a dish you know how to cook: get the recipe, buy the ingredients, cook it, and you're done.
But a new chef creating a brand new dish has a totally different process.
That chef might start by getting inspiration from what other chefs have done and then using their expertise to assemble a bunch of interesting ingredients that might work well together.
And then trying it and tasting it and getting lots of feedback from other chefs.
And then, essentially, iterating their way to a successful brand-new dish that they didn't even have a conception of before they started.
And so redesigning whole classrooms and schools for this new blended-learning world is a lot more like that chef creating that brand new dish than you or I just simply following directions and roasting a chicken.
There's just no playbook about how to do this.
And, to be frank, a lot of the software and hardware ingredients aren't all there yet.
But this does not mean you can just be wildly experimental and chalk it up to, "Oh, Michael and Brian told me to iterate." Right?
It's more like the scientific process.
You create a thoughtful hypothesis and then you figure out a way to test it in the quickest and easiest way possible.
And then based on that data, you pursue one of two paths: either more of what you've been doing or less.
It's a rigorous thinking process. [Michael] There are several bodies of thought around how to test, learn, and adjust as you build a solution to a problem: design thinking, lean startup methodology, and discovery-driven planning.
These are each important topics in their own right.
But here, just for time's sake, we basically combine them into a survey of these ideas to help give you a process for how you're going to innovate in your own context.
When you're doing something new in education that's relatively similar to what you've done before, the process is actually pretty straightforward.
Just think about purchasing a textbook.
You get a group of people together, you evaluate your options, then you pick the textbook, and then you roll it out in the classroom like you've always done.
But under uncertain conditions, it's totally different.
Think about Steve Jobs and Apple launching the first iPod.
If they had taken the playbook from Sony on launching the Walkman cassette player, they would have missed huge opportunities that this new model allowed for them.
Things like the iTunes Store or even the phenomenon of streaming music or the marketing idea of 40,000 songs in your pocket.
Exactly. And Summit Public Schools, one of our protagonists, is a lot closer to Apple than to the textbook example.
When they were implementing a model with playlists, that's something that really had not been done before in education.
So, what change and iteration usually look like in most schools, and certainly the schools that I worked in, is that every year in August, we would launch something new.
And it's something that we worked over the summer to plan for and create.
And whether it's a teacher in their classroom or a school, we'd launch it.
And then we'd run it for the entire year.
And we wouldn't change anything during the year.
And, honestly, we wouldn't really gather data about how it was working or not.
Sometime in the spring, as we were thinking about the next fall and staffing and budget, we'd be like, "Should we keep doing that? Yes, no, maybe?"
You know, whoever was at the table, it was sort of arbitrary how it'd be decided.
And that's kind of what happened for the next iteration.
It wasn't disciplined, it wasn't focused. It wasn't data-driven.
So, we knew we needed a different process, and we found and adopted "The Lean Startup" by Eric Ries, which lays out an iterative process he calls the build-measure-learn cycle.
Where you actually build something, but you're intentional about your build.
What problem are you trying to solve?
You in advance say, "What am I going to learn from this?"
How am I going to measure if it's successful?
And then you intentionally gather that data, measure it, and learn from it, and you don't just arbitrarily decide to do it again or not.
You actually iterate on what you've done based on that measurement and that learning.
So when you're implementing something in education that's totally new and unfamiliar, we suggest a framework that has six key steps.
The first step is to get clear on your objectives.
You need to know your desired goals.
[Brian] Then figure out how you're going to measure results.
How will you know if you're succeeding or failing?
And what data will you need to be measuring?
[Michael] Step three is to commit to action.
The learning happens by doing. [Brian] So you create mini-tests that allow you to figure out what is or is not working.
[Michael] And then you collect feedback from the students and teachers involved.
[Brian] And lastly, you keep iterating your way to success by doing more of what's working and tweaking what's not.
Being clear on your objectives is essential so you're not floundering your way through this process.
And remember, your objective is not about technology itself.
You're not just saying, "My goal is to have iPads in the classroom," because that's self-referential.
It's always about the learning you're trying to create and the way that you think technology will help you get there.
In step two, figure out how you're going to measure your results and what data you're actually going to use, so that you know whether you're being successful or not.
Now, when we talk about data, we don't just mean test scores.
You can use those, but you can also use things like student engagement or how much time students are spending with teachers.
Factors like that help you understand whether your model is actually succeeding.
Committing to action is really important because this can't just be a theoretical exercise.
It's like a chef: once they get their ingredients together, at some point they have to put them in the pan and see what happens when they introduce heat and how the flavors meld together.
And in education, that happens when you put this in front of real students.
You have to be willing to get your hands dirty and try these ideas, otherwise it's just all an exercise of thinking on paper.
So, roll up your sleeves and try this work as part of committing to action.
The important thing though is that as you try out these ideas, you're not doing so in these big, high-stakes rollouts that have just huge risk if you get it wrong out of the gate.
Instead, you want to really create mini-tests in step four that are just really cheap, low-cost ways to test out ideas and prototype really rapidly.
It's what a lot of people call a minimum viable product, or an MVP.
Basically a way that allows you to test really quickly whether something's going to work so that you have time to iterate.
It's like if you're designing a new phone interface.
You don't build the whole software code and release it to the world.
You might start by taking your phone and just sticking a post-it note on it and drawing the sort of experiences that you want the users to have.
And then you can tear through post-it notes. Before you even write a line of code, you can learn a huge amount.
One example of how we learned that prototyping is so important to the innovation process is when we didn't do it, actually.
It's when we decided to go through every single step and get ready to release a new tool for our students without going through and examining what it would look like for one student.
And halfway through Thanksgiving break, 30 hours into the project, two people are on the phone and we're like, "It doesn't show up on the student's computer and I don't know why.
The links aren't working and I'm not sure why."
So then we had to stop; we had already created something for two hundred students at that point, and we had to go back through every single step and see where we went wrong. In those hours of correction, it was a lesson over and over again.
Great, this is why you do small batch testing.
This is why you have to prototype right away because you are able to get a tool into the hands of users as fast as possible.
And then you can iterate from there on out, instead of having to spend any time fixing what doesn't make sense.
Typically in education, when we try something new, it's a multi-year process with huge planning teams.
And we spend a ton of money and do all these things and then finally give it to students to see if it even is a good idea or not.
And it's almost the opposite of an MVP.
And in comparison, there's a school that we support at Silicon Schools called Caliber Schools and they took a much more MVP approach.
They decided to start a summer prototype to try out their ideas before they even opened their school.
Caliber wanted to test out what kinds of support students would need to be successful in their model.
Or whether students without much computer experience could be taught to code, or what kind of teachers could be most successful in their model.
By building a real laboratory, they could test all these ideas out in person instead of just making a hypothetical argument on paper.
And just think about how much easier this is to do when you're doing it in a summer school outside of the normal school environment.
And as the theory proves true or doesn't, you can make adjustments much more easily.
And after-school programs are actually a very similar space where you can try these sorts of ideas out and have that freedom to iterate.
And when you're in this prototyping stage and you're coming up with new ideas, remember how important it is to get lots of different people as part of this process, so you can actually think outside of the box.
The beauty about prototyping is that you don't need that many tools.
You just need a mindset and you need a process.
And so really, the way we started prototyping is we took butcher paper, threw slabs of it up on our wall, and said, "Identify the problem.
Look, anything that bothers you, just write it up here.
And anyone who has a solution to it, do the same.
Take a marker, write, go, use your brains, be creative.
Let's be a team." And then, as we saw the problems forming, we just made mini teams.
And we thought through different problems, came up with different proposals, brought it back to the team, and enacted it within a week.
So, that was a really powerful way for us to ensure that the teachers were the drivers of innovation.
The teachers were a part of the design process because they were the ones best able to identify the problems right off the bat.
Once you're at the stage of trying these things out, this is where you need to collect data.
And of course, this can be test scores or quantitative feedback, but I also think it's incredibly important to just dedicate the resources to observe what is happening in these classrooms.
It's literally about getting another teacher or a principal or even a video camera if you have no other choice, to really monitor and watch what happens.
Because you can learn a huge amount if you study it closely, but typically in education, we just throw a bunch of stuff on the wall and then later try to remember what we thought worked or didn't work.
So, my best advice is to just commit to having an observer there who can really watch and process with you, because you can learn a huge amount from every one of these trials.
One other thing to that end: don't forget about feedback from students themselves.
Summit Public Schools does a great job of this where they use focus groups and regular surveys to collect feedback from students about what is and is not working.
Understanding student voice and student experience isn't just a way to kickstart a design process.
It is absolutely the blood that pumps through every aspect of the design process.
You get what students say, then you prototype, and you create an idea, and you put it out there.
Then you have to hear what they have to say again and what they think about it, and that is what you use to then continue to iterate forward.
The student voice is really the engine of the design process. For people who are nervous about taking a leap into the unknown, it's nerve-wracking to stray away from what you're used to, and some worry that it might hurt students.
But really, at the end of the day, when you make students the engine of the design process, you can't go wrong, because you're always coming back to them, and they will hold you accountable to a higher standard than you could ever hold yourself to.
The reason you need to keep iterating is that you will not get this right, right out of the box.
None of us are actually smart enough to design the perfect model on paper.
And you need to de-risk this for yourself and allow some failure, but that concept of build, measure, learn will let you keep the sort of virtuous cycle of innovation going and you will get to better results.
So, to understand what iteration looks like in practice let me give you an example.
When we started and launched the idea of a playlist for students who are self-directing their learning, they would go to a playlist and be able to select how they would learn before they would then move on to show what they know.
Our first playlists were sort of based on what you would find on your iPod.
And so, we just put a whole bunch of resources together and gave it to kids.
It didn't accomplish what we wanted the playlists to do.
Honestly, kids weren't learning that way.
And so, we had data that said they weren't learning from the playlists, and, so, that's not what we wanted to accomplish, and we had some ideas about how to improve it.
We took those ideas and we iterated on the first version and tested them.
We said, well, you know what, what if we take the playlist, divide it into groups, and organize the resources around a header that says, here's an objective you want to learn, and here are some resources for it.
Would that improve things? And we did that.
We gathered student feedback and we heard their voices.
We looked at their performance data.
And so, they started to learn a little bit more.
But we learned and then we went through the cycle again and again and again, and each time got better and better.
So today, the playlists are significantly improved. They have introductory sections.
They start with a diagnostic assessment where kids can really see where they are.
They conclude with a direct link to the final assessment.
They have ways where kids can mark what they've already done and keep track of their progress.
And they allow kids to crowdsource their feelings about the playlist.
Did it work for me? Did it not?
And give their peers a whole set of rankings about what's effective or not.
So, that's an example of going through the cycle multiple times using the measurement, the learning, to continuously iterate and improve to a place where we feel really good now.
In school settings, there's actually an added challenge.
Sometimes you have a great theory or a perfectly constructed idea, but when it hits the reality of real schools and real students, it all falls apart.
Maybe the Internet's down one day.
Maybe a student had something traumatic happen at home and they come in and they ruin the lesson for others.
And it's really easy to throw the baby out with the bath water, but sometimes you just need to try a different iteration, or actually stick with something through the difficult stage while you're still learning how to do it well. So, implementation really matters.
And this is where Brian and I would say you really have to trust the gut of actual educators on the ground about when it's worth doubling down on something or when you actually have to step away from something because it's not working.
I have a friend who ran a network of schools here in California, and they tried a really thoughtful pilot of a new piece of software and he did it right.
He had a small group try it, he measured the results.
And they actually got very big gains for their students.
So, then it was clear that the right thing to do was let's scale this.
Let's put this into all our schools.
And they shared the data and they made a plan.
And they rolled it out, and, as happens, he turned his attention to all of the other parts of the job. A couple months went by, they got the data back, and it actually hadn't had very great results.
So the first thing all the other people involved said was, "See, it's not a good piece of software.
It doesn't work." And his response was like, "No, it does work! It didn't work the way we just did it."
Sometimes, it's about sticking with it or going back and looking at the iterations or the implementation to figure out what's the right way to make this work.