
Faculty Learning Community: Agenda and Notes (January 21, 2010)


SUGGESTED READING
Jeannette M. Wing, "Computational Thinking." Communications of the ACM 49, 3 (March 2006).

Snacks will be served in the Dorothy Vernon Room of Haffner Dining Hall

AGENDA:
a discussion of technology in the educational process,
led by John Dougherty and Bill Huber 

Comments


Notes of the conversation

The following is a partial transcript of the conversation. Frankly, I would prefer to read a summary, but I was delighted to capture (at least in part) the flow of ideas and the measured, thoughtful cadences of the interchange, and so decided to offer this as-is, with apologies in advance for any errors and all omissions.

--Bill Huber, 1/31/10

Present: John Dougherty (JD), Paul (PG), Alice (AL), Howard (HG), Mary Angela Papalaskari (MP), Ann Bacon (AB) of Abington SD (by phone), Anne Dalke (AD), Bill Huber (WH), Jacqueline Leonard (JL).

***HG (Introductory remarks)***

Technology conversations. Re-looking at what we might want to do. As you have ideas for ways to use it, let’s talk about it and explore it.

Ann (AB) has limited videoconferencing technologies: could conference with one outside location, for example.

Conferences: minimal discounts for faculty from NSTA. Maybe pay for students and alums to go and report back to us? If so, we can send out information. Maybe one or two of this group could go, too.

Alice & Howard will be holding Thursday “education conversations” at Erdman at lunch. Aim is to extend conversation to students who might have no connection to us. Identify students who might be interested in education and math & science offerings, ask them to come along. A place to talk about things that are going on in education, including their experiences or things happening outside the BiCo.

Cross-visits: still open to the idea. Sounded promising. How would we like to proceed? (Work in pairs or triads to observe another person’s classroom and lessons, then talk to the group as a whole about it.) [WH: Open invitation to problem solving sessions].

Jacqueline Leonard presents at next meeting [NB: this was later changed], about collaborating on a project to get funded: Betsy Coleman project. Open to inviting students to that talk.

Noyce project: NSF grant of $1M to fund nine students in their 4th and 5th years to become certified as math and science teachers.

Use personal connections to invite students.

***JD (prepared remarks)***

This is how computer science educators see one of our contributions: not just to produce software engineers, but also to provide a level of proficiency where people can leverage concepts in computer science. This could be as overt as scientific computing, but even within astronomy, for example, you need computing to get the data, process them, and deal with all the big issues that follow. A body of work studies these things. I’ve been thinking about this a lot. We try to provide it in the curriculum and in summer institutes.

History: In the 80’s and 90’s the idea was literacy: understand an algorithm. “Algorithmic thinking.” Set up a recipe to get something done in a systematic way. In the late 90’s, a group convened around “being fluent with IT” (Larry Snyder, U. Washington). Push beyond literacy to fluency. Think abstractly about problems and computing. The way you think about a problem might help you do it better: automate it, and so on. Fluency is a nice word: somewhere between literacy and proficiency. Literacy = can use Microsoft Word; fluency = understand that Google Docs is a word processor, could even figure out how to use Excel as a word processor if you had to. This is an explicit part of Computing Across the Curriculum. “Fluency with Information Technology” also let you tell people how “FIT” they were.

“Computational thinking” goes yet further. Jeannette Wing is its champion; her article appeared in the flagship magazine for computer science. The idea is to get people ready to use computing, but it’s more about how you think than about how you program or click buttons: it scales. As new things come out, you can make good decisions: how to incorporate a mobile phone, how to get a new one, buy a computer, etc. Also to understand what computers can’t do, when to guess, etc.

Some italicized points in the last page are basically about humans, who can be computers. How you can use that to get more done, to do better quality work. Computational thinking should complement what people do: offload stuff to help us do things better. Complementing math and engineering thinking. It has taken off: Carnegie Mellon has an institute.

It’s manifested in many ways; technology shows up in education in many forms. Kinesthetic learning (CS Unplugged): exercises done with people, not machines, that get them to understand concepts in computing. So technology in education doesn’t necessarily mean computers.
What do you, outside of computer science, want students to know? Where is it in the priority? Want people to write, do math, stats, etc. Where does this fit in?

***General Conversation***

MP: We are used to thinking of computational thinking as relevant, so all students should be exposed to it. Is this just some chauvinistic thing that we [computer science people] think this way, or is it time that educated people should have an abstract way of viewing processes and mechanisms and ways of doing something? That’s what we learn in computer science. A student who takes history but is not a history major learns a concept that transfers to any realm of life. We would like to believe the same is true for computer science. We are faced with an incredible amount of technology to interact with (which we learn when very young), so being familiar with tech, computer literacy, is not as much of an issue anymore; but actually understanding some of the fundamentals of computer science, how you analyze a process and what’s going on, how you think about algorithms, whether it’s on a computer or not, is very important. The previous round of pushing for fluency coincided with an explosion of technology that people became familiar with, and computers became so easy to use … Students used to take a computer science course so they could run a spreadsheet; it was about literacy. Nobody needs to do that anymore, in universities at least. Then it appeared like maybe only some engineers [needed to take computer science courses] … everybody else can use their cellphone and be happy.

HG: By the time students are in university, it’s no longer in fashion. What’s being encouraged at Abington? I assume this computational thinking might not be here to stay: some new paradigm, something on the horizon…

[Enter Jacqueline Leonard (JL) from Temple.]

HG: What’s going on at K-12?

AB: As I look at the skills in this article, the thought coming to mind is that the really interesting way to teach some of them to young students is through the thinking that was behind LOGO. In the 80’s and 90’s we had time to play around, look at recursive thinking, etc. What’s happened now with accountability … has thrown things like that out of the school day, which is very unfortunate for all students, particularly for the best and the brightest, because you will be looking for them. The kind of students we had in the past you will not be seeing now because we don’t have the time. Exactly what are these skills and how do they translate into a curriculum? Gifted and talented, summer program?

JD: A group of us in the Philadelphia area are going to Harrisburg in February [skeleton crew there?] to spearhead an effort (PA is a little behind the average) in terms of K-12 prep for those intending to study computer science explicitly. But here we’re talking about computational thinking for everybody. But there are groups looking for ways to provide opportunities to teach these skills: Scratch, Alice, etc. It’s working backwards: at the undergrad level we’re looking for ways to help K-12. The goal is to have people who can think and appreciate computational thinking. Hopefully we’ll have some good news for you.

PG: For the sake of argument, let me see if I can develop a connection between the report coming from K-12 and some of the issues that you’ve been raising. It seems to me it’s worth noticing that the sequence from literacy to fluency to something else is a sequence which every discipline that I know of has gone through in the past 30 years. The reason why I think that’s worth talking about is that every discipline that has gone through that has ended up coming to the conclusion that there’s something from their discipline that everyone ought to know. Part of the reason K-12 teachers don’t have time is because every discipline has decided on something every student has to know … so there’s more they have to know. No time to play.

What also seems to be interesting is that every time a discipline goes through this, it ends up saying that the critical thing it has to offer is really a general thinking skill. And so one might entertain the idea of all of the disciplines getting together and instead of insisting that something they represent needs to be taught, we might try to get them all together and make a single unified package of all these critical thinking skills.

JD: Wouldn’t that be the dream, having it all integrated? I’ve heard of project-based, … that’s how they kind of emerge. It’s probably effective, but very, very hard.

PG: What would make it a lot easier is if a group of people actually stopped thinking of themselves as biologists or computer scientists or physicists or chemists, stopped asking how to combine all the disciplines into a single curriculum, and instead asked the more general question about critical thinking skills: how do we develop a course or curriculum that enhances them?

AL: Would you guys say—for you, what you find the most exciting and useful of these computational skills? In education, one key word for education studies is “context.” We are obsessed with context: grounded in a specific social location. You start teaching and learning through that lens. What lenses do you work through as a computer scientist?

JD: This comes up for me in a service course and in a summer institute for teachers. The ability to understand how to debug: how to know there’s a problem in an algorithm and figure out where it is. It sounds like a binary search [for the source of the problem]. The ability to develop a strategy to figure out something that has a problem with it. The other one is appreciating the limits of computing [explaining the spectrum of problem difficulties]: we know a computer is great for [such-and-such]; there are problems where it could be used but isn’t practical; and there are problems that just are not solvable, even though it feels like they should be.
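
[A minimal sketch, in Python, of the "binary search for the source of the problem" idea JD describes. The revision list and the is_broken check are hypothetical, invented here just to make the strategy concrete; nothing of the sort was shown at the meeting.]

    def first_broken(revisions, is_broken):
        """Assume the first revision works, the last is broken, and once the bug
        appears it stays; bisect to find the first broken revision in about
        log2(n) checks instead of testing every revision."""
        lo, hi = 0, len(revisions) - 1        # invariant: revisions[lo] works, revisions[hi] is broken
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if is_broken(revisions[mid]):
                hi = mid                      # the bug appeared at or before mid
            else:
                lo = mid                      # the bug appeared after mid
        return revisions[hi]

    # Toy usage: pretend the bug crept in at revision 7 of revisions 0..15.
    revisions = list(range(16))
    print(first_broken(revisions, lambda r: r >= 7))   # -> 7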

AD: When you say “computational thinking,” you have some sense of what that is?

JD: Humans can think computationally—we can execute algorithms—but intuitive thinking, doing things in a more imprecise way—all the things people can do but we can’t do with computers now. That’s kind of the complement of computational thinking: there’s a whole bunch of stuff to human thinking outside it.

PG: That’s a terribly important distinction. If in fact computer science has a singular contribution to make towards helping people learn to think better, it is the distinction between computability and non-computability. So let me make it concrete. If you agree on a set of starting principles—axioms—if you agree on a set of rules that define how you’re allowed to cause starting principles to interact, and if all of that is algorithmic (and deterministic), then you have a computable system. It is a demonstrable feature of that definition that there are certain kinds of problems that humans solve that cannot be solved computationally. They are the things humans solve intuitively or by guesswork. Computational systems cannot solve intuitively or by guesswork. It’s useful to know the distinction.

AD: Evolving Systems group. Tuesday afternoon Arlo Weil presented the deep time concept: a sense of the long history of the universe and how different that is from our sense of daily time [/exchange/node/5983 ]. Towards the end, he talked about how what geologists do is really not science, because they have data sets like strata filled with fossils, separated by millions of years, so they intuit or guess what happened in between. A scientist would observe, but you can’t actually go back and do that. He was interestingly upset about that. It would feel to me that, sure, this kind of computational work is important, but only if you keep it in play within this other space of filling in the gap. So each discipline has illusions about what’s going on in the other ones. … So, how is computational thinking different from what I want my students to do when they’re interpreting literary texts and looking for patterns?

PG: It is different. Because they’re different, they are complementary.

MP: This guesswork part. I think you’re playing with words when you say humans use “guesswork.” They guess plausible solutions and computers can do that too. I don’t see any difference there.

PG: They do. The defining thing is guesswork. Humans, from a young age, are superb at face recognition. Solving that problem algorithmically has proven to be nearly impossible. The reason infants are capable of solving problems is that they use guesswork as a natural part of their problem-solving process.

MP: Human beings could be viewed as computers. I would like to point out that babies have incredible amounts of inputs and processing capacity: that’s why they can do much better than (electronic) computers. They have some algorithm that is being developed in their brain. I don’t say I understand what they do. There is what is happening inside the brain, then there is thinking about what is happening inside human brains: that’s what we’re calling computational thinking. We’re all doing computing in our brains (and computers are computing in their CPUs), but computational thinking is thinking about those things.

AD: So computational thinking is not algorithmic computing.

MP: It’s a human function, a thinking about processes. For example, how biologists can think of a cell in terms of its function, how it’s like an input-output machine. That’s a way of looking at biological processes. About the limits of computation: that’s a fascinating thing. There are concrete things you can show to students. For example, you cannot write a program that checks other programs and guarantees they never go into an infinite loop. Our notion of what a computer can do is an intuitive notion … but it is remarkable that when people first turned their attention to algorithms in the 20s and 30s, three different people working independently came up with wildly different definitions, but then subsequently proved they were all equivalent.
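
[MP's infinite-loop example is essentially the halting problem. A sketch of the standard argument, in Python, assuming for the sake of contradiction that a perfect loop-checker called halts existed; the names and framing are mine, added only to make the result concrete.]

    def troublemaker(halts):
        """Given any candidate checker halts(f), which claims to return True
        exactly when calling f() eventually finishes, build a function it must
        get wrong."""
        def contrary():
            if halts(contrary):      # if the checker says "contrary finishes"...
                while True:          # ...then contrary loops forever instead,
                    pass
            # ...and if the checker says "contrary loops forever", it finishes here.
        return contrary

    # Whatever answer halts() gives about contrary, contrary does the opposite,
    # so no program can be a correct loop-checker for all programs.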

JD: Computational thinking is taking things we have learned in computer science and applying them to other places: for example, driving behavior. That’s an application of computational thinking. One of computer science’s contributions is to define that distinction between what is automatable and what is higher-order thinking. Natural language processing is clearly something that computation can simulate, but it’s not at the same level. Sometimes Artificial Intelligence defines something we actually can implement that we thought was a higher order function, but we can now automate. People can use that then. I would rather offload tedious stuff to computers. (David Brooks recently wrote about that.)

MP: There’s also the notion of how much work something takes. You have a network and a postman has to deliver mail without backtracking. You have a similar network, but it’s a snowed-in street network: what’s the least number of streets to plow to get to everybody? In a way they are similar problems. As the size of the network grows, the problem of plowing the snow is very much doable, whereas the postman problem is intractable once you get past 20 or 30 stops: you can only solve it with approximation algorithms.
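
[A sketch, in Python, of the tractable/intractable contrast MP draws. The reading of the two problems is mine, not hers: "plow the fewest streets so everyone is reachable" is taken as a minimum spanning tree, and the postman's no-backtracking route as a shortest-route search that brute force can only solve by trying every ordering of stops. The small street network is invented for illustration.]

    from itertools import permutations

    # A small made-up street network: (intersection, intersection, length) triples.
    edges = [(0, 1, 4), (0, 2, 3), (1, 2, 2), (1, 3, 5), (2, 3, 7), (3, 4, 1), (2, 4, 6)]
    n = 5

    def plow_cost(n, edges):
        """'Snow plow' reading: the cheapest set of streets that keeps every
        intersection reachable, i.e. a minimum spanning tree. Kruskal's
        algorithm solves this in polynomial time, even for huge networks."""
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        total = 0
        for a, b, w in sorted(edges, key=lambda e: e[2]):
            ra, rb = find(a), find(b)
            if ra != rb:             # keep the street only if it connects new territory
                parent[ra] = rb
                total += w
        return total

    def postman_cost(n, edges):
        """'Postman' reading: the shortest route from stop 0 that visits every
        stop exactly once. Brute force tries (n-1)! orderings, which becomes
        hopeless once the network grows past a few dozen stops."""
        INF = float("inf")
        dist = [[INF] * n for _ in range(n)]
        for a, b, w in edges:
            dist[a][b] = dist[b][a] = w
        best = INF
        for order in permutations(range(1, n)):
            route = (0,) + order
            best = min(best, sum(dist[route[i]][route[i + 1]] for i in range(n - 1)))
        return best

    print(plow_cost(n, edges), postman_cost(n, edges))   # tiny network, so both finish instantly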

JL: Your examples are good, but you leave out the notion of culture. A computer can’t deal with culture like a human. Context: where people are situated, and … There’s also smell (in addition to vision), how that affects recognition of a mother. So when we talk about computational thinking and what people vs. computers can do, the human capacity will always be more dynamic because of all the cultural connections we have. For example, all the decisions I made on the trip here. All kinds of things interacted with getting here. In math books there are all these rate-time problems that assume things that aren’t the real world at all. People argue about contextualizing, whether there’s culture in a problem, “naked math” [just the numbers]. Every problem is contextualized somehow; there is someone’s culture there somehow, and it’s always the dominant culture, so it will always be solved better by the dominant culture because it’s part of their experience. How many people study whether the children actually want to engage in the problem? For example, yesterday I was talking with a student about the GREs and how the test is computerized. You can’t work through it backwards, for example. My top student once did it that way. Her scores (in 4th grade) came back perfect. But if I as a teacher had forced her to do it my way…. But in a computerized test they can’t do that. We’ve lost some things by making that a computerized test. There are human dynamics that people want to opt out of that are part of the process that could improve a GRE or any kind of test score.

MP: Those are interesting issues. You can teach and think about them: what makes us human?

JD: I can simulate a lot of that cultural stuff. Haptic sensors, audio, even smells; and so on. Philosophical debate: is that real or just a simulation? Strong vs. weak AI debate.

PG: Rather than arguing about whether this is all computer science or all context, if we’re actually trying to design a curriculum in thinking, what we ought to be doing is noticing that there are several distinct particular things and building a course around those distinct particular things. At the moment, there is a useful concept of computability: that’s worth teaching, in part because it indicates some other things. One of them is context: that’s worth teaching. We could assemble a list of these things.

AL: The word “solve”: I have a feeling it’s a term of art. If a baby “solves” the problem of “can I trust this person,” the answer is completely satisfying in that time, idiosyncratically, whereas I think you’re talking about “solve” in a different way. It would be fun to think of different ways we as people face problems, define what a problem is, and what its solution is at different scales and levels of relatedness. It sounds like “solve by guesswork” by your definition is not possible.

MP: This is not unique to computer science. To find a satisfactory solution is to “solve.” Based on what you’ve observed so far, you come up with theory. That’s what science is. Computer science is not that different.

JD: But in the pure sense, computer science has this model. Once you make that model, you’ve defined your universe and define what things are solved. But there are lots of problems where inputs vary, the question is “is it good enough now,” … Is computer science a natural science?

WH: [My recent reading of Rich Mayer’s work may be relevant. He discusses studies of teaching problem solving. We can obtain “deep learning” and transfer of problem-solving skills only in a limited way: namely, the specific skills that are taught. They don’t seem to transfer to other domains. Doesn’t that create a challenge for Dr. Wing’s claims? What evidence is there that teaching about computing will transfer to anything other than computing? If we don’t have any, then this reading seems to lay out an interesting direction for education research.]

PG: That the ability of people to solve problems over and above the ones they were challenged to solve in the educational experience needs to be tested raises an interesting issue. On a narrow level one wants to know whether learning math is generalizable … you can define transferability within a discipline, but you would also like to see transferability across disciplines. The really interesting question is that if you try to assess in terms of generality, you can’t test everything. Consider the concept of the intelligence quotient: this was originally developed as a mechanism for assessing general intelligence (“g”). You assess this by giving people a whole array of problem-solving contexts. It doesn’t in principle matter which ones you give, as long as the array is huge. You then ask whether there is a correlation in a person’s performance across a wide array of problem-solving tasks. Picking up the theme, what you would actually like to see is a generalized impact of a particular educational experience. A way to assess it would actually be using some measure like an IQ test. You now ask, as a result of the experience in your classroom, is there a new statistical …
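
[A toy illustration, in Python with numpy, of the statistical idea PG sketches: if many different problem-solving tasks all draw on one general ability, then scores on those tasks correlate positively across people. The numbers are simulated for illustration, not data from any study.]

    import numpy as np

    rng = np.random.default_rng(0)
    people, tasks = 500, 6

    g = rng.normal(size=(people, 1))            # each person's general ability
    noise = rng.normal(size=(people, tasks))    # task-specific variation
    scores = 0.7 * g + noise                    # every task draws partly on g

    # Correlations between the six task scores across people: the off-diagonal
    # entries come out positive (around 0.3 with these made-up numbers).
    print(np.round(np.corrcoef(scores, rowvar=False), 2))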

JD: It’s fun, but is it effective?

Anne Dalke

on our airtight compartments

I was so struck, @ the end of our in-person conversation this past Thursday, by Bill's report on the challenges, in the field of education, to the concept of "transferability"--which trouble our presumptions that students can transfer something they've learned in one domain, or @ one scale, to another. Howard's supplied me w/ some recent reading on this topic, and I plan to learn more about this...

...but also wanted to share that, in the interim, in the process of preparing notes for my new course on the James family, I came across a quite-resonant passage. Taken from the May 9, 1891 entry in The Diary of Alice James, it describes how our learning a lot in one area always and provokingly remains in an "airtight compartment," having no effect on "the rest of self," not contributing either to general conversation or "the practical wisdom of life":

what more than anything else makes this estimable race seem so completely foreign, as if we could not have had, possibly, a common descent, is the microscopic subdivision of their knowledge. It is impossible to predicate that supremacy in one accomplishment will...raise to the simplest level of intelligence the whole man, for he carries his gift, which he so often has in great perfection, in an airtight compartment through the walls of which radiates no germinating ardour, and he leaves the rest of himself with a touching, childlike candour, just as Nature made him....the established law, that a child's mind should be dedicated to and perfected in some one study alone...whilst all other fields are left fallow to the seed of accident. This is why things end so short off when you are talking. You ask some question cousin germane to the subject, when all the machinery stops....it discourages and interrupts the social flow to a greater degree than the hit-or-miss flutter and flap we give to the wings of imagination when we feel under foot the distressful ooze of doubt...whilst the quivering Yankee catches up, in the ravelled edges of his culture, simply an approximate knowledge of many things. You are so impressed, at first, when you come by the rounded smoothness of intellectual interchange, and are amazed until...you see that you can make no call of any sort upon the individual for a movement of inspiration....an excresence of one order of knowledge rarely dissolves itself into the practical wisdom of life.
