30 December 2013

5 Ways to Pwn Your Math Homework With Google Search

I couldn't finish my homework because I didn't have graph paper.
I couldn't finish my homework because I didn't have my graphing calculator.
I couldn't finish my homework because I didn't have my calculator.
I couldn't finish my homework because I don't know how to use my calculator.
I couldn't finish my homework because I don't know how to convert units.
I couldn't finish my homework because I didn't have any examples.
You've only got to teach math for a few weeks before you've heard every one of those excuses. No one answer is going to be good enough for the kid that goes through all of those, but you can use a simple Google search to perform many necessary tasks to "pwn" your homework and save time.

All of these tips work simply by typing in the search box on Google.com, or using the "omnibox" on browsers like Chrome and Firefox.




1. Calculate.
Typing in a calculation such as -1*-3 pulls up an onscreen scientific calculator that gives you an answer and lets you continue with your calculations.

2. Find and print graph paper.
Your teacher does it this way often, too. :)

3. Graph equations and trace for points.
No need to even go to a graphing calculator app or website. Trace along the line to find intercepts, coordinates, and other key features.

4. Convert units.
You may still need to show your work (especially if you're in a lesson on the topic), but this will at least check your work.

5. Solve systems by graphing multiple equations at once. 
Separate with a comma. Trace along the line to the intersection and approximate the coordinates.

BONUS: Find a video tutorial without going to YouTube.
Many graph or calculation searches will also bring up relevant video results that you can play from the search screen.
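If you want concrete starting points, here are a few example searches that exercise the tips above (exact wording is flexible, and Google occasionally changes which phrasings trigger each tool):

  • Calculate: -1*-3 or 15% of 68
  • Convert units: 30 mph in m/s
  • Graph an equation: graph x^2 - 4
  • Graph a system: x^2, 2x + 1 (comma-separated)
  • Print graph paper: printable graph paper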

For more info, all of the documentation about the Google calculator and grapher can be found here.

29 December 2013

Don't Give Up on Manipulatives!

Literally out of my daughter's mouth -

"Yay! I get to learn my shapes!"



I have a pretty cool app on my iPad for ages 2-8 called Agnitus, which the kids play from time to time, that has practice in colors, shapes, foods, and more. I like it because it has many free games and sends you progress updates on your kids' learning, but I think sometimes my daughter messes up on purpose. :)





This ad on YouTube even claims it's great for the kid who wants to touch everything, which certainly describes my boy.



My kids love their Kindle time as much as the next kid, but their kinesthetic learning styles demand tactile experiences that a tablet (as much of an improvement as it is over the mouse) cannot provide as well as tangible, spatial things.
Montessori Wooden Shape Board, Fraction Sorter on Amazon
I'm often quick to disdain physical manipulatives because they're low-tech, take up space, and are often expensive, but it's important to remember that (especially with young learners) many kids need learning experiences to be as concrete as possible. Although the fraction blocks above still just represent other things, the kids enjoy puzzle toys like this.

I don't want to condone educational hoarding, but don't throw out the old manipulatives with the bath water when you get that iPad cart.

26 December 2013

Getting Google Forms to your Students

I've seen (and done) several trainings and presentations on the power of using Google Forms for rubrics, quizzes, and other assessment, but how to get said Forms to your STUDENTS is an important aspect of the workflow that often gets overlooked or de-emphasized in the course of the training.

WHY DOES IT MATTER?
You can have the greatest Google form/survey in the world, but if you cannot get it to your audience/students, the form is worthless. When you're using Forms within your Google Apps domain (with other teachers/staff at your school/district), it can be as simple as emailing the live form to everyone on that domain, but the access gets more complicated when students will be using their own Google accounts (or none at all).

WHAT ARE MY OPTIONS?
The solution you choose for getting your Google Forms to your students will mostly depend on your set-up, but most often you'll have several of these at your disposal.

WITH STUDENT GOOGLE ACCOUNTS ON YOUR GOOGLE APPS 
1. Set up a contact group with your students' Google accounts for each class (done in Gmail or Contacts). Compose an email to the classes you want to submit responses (type the name of that class in the address field) and link to the live form or embed the form in the email with HTML.
2. Create a doc with a link to the live form and share the document with the classes/students you want to submit responses.
3. Take the link to the live form and shorten it with a service like bit.ly or tinyurl.com. Put the shortened link on your white/chalk/SMART/Promethean board or projector.

WITHOUT STUDENT GOOGLE APPS ACCOUNTS
1. Set up a contact group with your students' Google accounts for each class (done in Gmail or Contacts). You'll need to ask them for their Google account first, and some may need to set up accounts. Compose an email to the classes you want to submit responses (type the name of that class in the address field) and link to the live form or embed the form in the email with HTML.
2. Create a doc with a link to the live form and share the document with the classes/students you want to submit responses.
3. Take the link to the live form and shorten it with a service like bit.ly or tinyurl.com. Put the shortened link on your white/chalk/SMART/Promethean board or projector.
4. Take the link to the live form and post it to your class website
5. Embed the live form on your class website (Process simplified if you use Google Sites)

ON (SHARED) iPADS
I advise against any solution where you email the link to the STUDENTS, because I think you lose more time waiting for students to log in to email on an iPad than you would on a tower or notebook.

1. Create a doc with a link to the live form and put the doc in a shared folder on Drive accessible by the default account for your cart.
2. Take the link to the live form and shorten it with a service like bit.ly or tinyurl.com. Put the shortened link on your white/chalk/SMART/Promethean board or projector.
3. Take the link to the live form and post it to your class website
4. Embed the live form on your class website (Process simplified if you use Google Sites)

ON 1:1 iPADS OR BYOD
1. Create a doc with a link to the live form and put the doc in a shared folder on Drive accessible by all the students in your class. You'll need to share the folder with their personal Google accounts.
2. Take the link to the live form and shorten it with a service like bit.ly or tinyurl.com. Put the shortened link on your white/chalk/SMART/Promethean board or projector.
3. Take the link to the live form and post it to your class website
4. Embed the live form on your class website (Process simplified if you use Google Sites)

HOW DO I KNOW WHICH OPTION IS BEST FOR ME?
I have an iPad cart in my room from which students access Drive from a shared account, and I will often use several of these options for one Form. Which solution is best usually depends on whether I want students to have access to the form from home (on their own accounts) or whether we'll just be using it in class. My best advice is to not fall in love with any ONE application.

21 November 2013

Because Weighted Averages - Fighting Back the Points Hounds


Although we're continuing the conversation and including more teachers in standards-based grading training in my district, almost all of the classes remain traditionally graded, with categories for homework, formative assessments, summative assessments, and finals that were agreed upon by each high school.

In most every class I grade on our traditional grading scale, I'll have kids who catch the homework bug by November. "I'm doing all of my homework - how come I'm still failing?" is what they'll ask me. No matter the number of times I tell them that those 2 missing homework assignments are not the reason they don't have the grade they want, they insist that "every little bit will still help." While that is technically a true statement, it misses the point. Although we've devalued homework to 10% of our averages to stress assessments and projects in grade calculation (to stress evidence of learning), as long as we have a category that includes homework, the point hounds will be scrounging at the table for their point scraps.
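To see why those two missing assignments barely move the needle, here's a quick sketch of the weighted-average arithmetic. Homework at 10% matches our scale; the other category weights and the scores are hypothetical, just for illustration.

```python
# Weighted-average sketch. Homework at 10% matches our scale;
# the other weights and all the scores below are made up.
weights = {"homework": 0.10, "formative": 0.20, "summative": 0.55, "final": 0.15}

def course_grade(scores):
    """Weighted average of category percents (0-100)."""
    return sum(weights[cat] * pct for cat, pct in scores.items())

# Missing 2 of 20 homework assignments (90% homework) vs. turning in all 20:
current = {"homework": 90, "formative": 60, "summative": 55, "final": 60}
all_homework_in = dict(current, homework=100)

print(course_grade(current))          # 60.25
print(course_grade(all_homework_in))  # 61.25 -- one point; assessments drive the grade
```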

Here's what I conclude:
As long as I include homework in the calculation of grades, I will always fight the students and parents that want to squeak out every homework point "just in case."

It takes persistence and perseverance through the first several weeks as students adjust, but I always feel most successful evaluating my students' learning the more I stress mastery and devalue points through standards-based grading.

In the meantime, I created this video as a resource for my students who want to set goals and track their weighted averages.



06 November 2013

10 Free Tools for your #Flipclass Videos

"We're making a video tomorrow, and I don't even know what we're using to do it, or where we're saving it, or what we're doing after we save it..."

We had the honor of hosting +Jon Bergmann in our district last week, so my colleague is really motivated to begin flipping her classroom, but she's feeling overwhelmed to say the least. So where can she begin?

The Basics: No Frills Approach to Jumping In

  • Shoot video of you and/or a co-teacher in front of a whiteboard with a digital camera, webcam (we have really handy iPEVOs in our district), smartphone, or tablet. Upload the video to YouTube. Link to that video on your class website, email students the link, or send them to the URL for your YouTube channel.
    • Features: 
      • Easy to implement
      • tools probably already at hand
    • Cons: 
      • Hope you get it in the first take! :) If you're not going to edit, you'll have to rehearse once or twice to hone your timing and scripting.
      • Sound quality dependent upon built-in microphone
  • Take a screencast of your SMARTboard using the SMART recorder. This "SMART tool" is included with the install of SMART Notebook on your Windows or Mac machine.
    • Features: 
      • Because it's a screencast, you can focus on what you say rather than how you look
      • Your students are probably already comfortable with the look of the video
      • Pause button is always on top, so you can pause to change colors, insert images, etc., without adding to the length of your video
    • Cons
      • Some students may miss the personal connection of seeing your face and inflection unless you pair with a webcam in the recording area
      • Similar to shooting with your smartphone or tablet, you'll need to rehearse or end up spending a lot of time editing (which you probably don't want to/know how to do, which is why you're using SMART recorder)

  • Screencast-o-matic: This is my go-to screencast utility. You can use it as a web app or native application. 
    • Features:
      • All of those listed above for SMART Recorder
      • Can layer the recording window on top of Notebook, but also over a web browser, PowerPoint, or content-specific software for demonstrations. We have a teacher in our department who loves using Geometer's Sketchpad and could really take advantage of this
      • Multiple save options. Save directly to YouTube, Screencast-o-matic, or just to your file on your machine
      • Can scale recording window however you like, hiding other applications you have running without closing them down.
      • Clip, trim, and review clips before saving or continuing
Free iOS Apps:
  • ShowMe - easy to use screencast whiteboard, import images, search a library of other teachers' videos
  • Educreations - easy to use screencast whiteboard, import images, search a library of other teachers' videos
  • Touchcast - Can be used as a screencast whiteboard, a camera of you in front of a whiteboard, or you picture-in-picture in front of website/image
  • Ask3 - easy to use screencast whiteboard, import images, classroom community and message board built in. "Ask3" is derived from the strategy "ask three, then me," so the app is designed around students collaborating and answering each other's questions from your video.
  • Vittle Free - import photos; recordings limited to one minute in this free version

Other Free Websites and Software with Extra Features
  • Present.Me
    • Features:
      • Easy to set-up
      • Video of you alongside slides, or of you alone
      • Trim/clip/review recording if you make a mistake
      • Link to your Google+ account
    • Cons:
      • Requires flash, so not an option on iOS 
  • Jing (from the creators of Camtasia Studio)
    • Features:
      • Available as a native app, so it will be available even when WiFi is not
      • Videos stored at screencast.com
      • Restart a recording if you make a mistake
      • Pair with webcam to shoot yourself, too
Use a tablet to annotate on whiteboard software, or just present alongside a slideshow
Summary
Try out several tools. Find the right combination between polish and ease of use. Give yourself permission to grow in your skills presenting and editing. 

You're more likely to persevere through hiccups in your implementation of the flipped classroom if you'll allow yourself to fail every once in a while. (So that you know what to improve upon.)

01 November 2013

Google Apps for Education: The Secret is Sharing


As we get further into our second year as a Google Apps for Education district (and Google continues to refine their product), I see that the advantage of using Google for all of my office needs over MS Office is the integration of most of my online life that Google affords me.

We do not have the Office 365 tools that level the playing field, so anytime my ultimate goal is to share, collaborate, or publish to the web, I open up a web browser, not my Microsoft Office task wizard.

The presentation below very briefly reviews most of the Google Apps products that we have enabled for our district, and the bulk of the slides are filled with common teacher scenarios that can be served more efficiently or effectively by a Google Apps product.



Other Google Apps Resources

Google Apps for Ed - a weekly digest of links, blogs, videos, and tweets with Google Apps tips and tricks
Google Docs, Sheets, and Slides Troubleshooting Tool
Google Docs, Sheets, and Slides Help Center - step by step instructions for many tasks
Using Google Docs with your Students - Training Module

5 Questions for Every Standards Based Grader

I've dabbled with standards based grading in my high school math courses to varying degrees since January of 2010.



If you only read this far, allow me to share one piece of advice so you'll stop being scared, and just get started.

The only "right" way to do standards-based grading in your classroom is the way that is most fair to your students and gives you the best information about their learning. Most other variables will align with district policies, course structures, or personal preference.

Smart people like Robert Marzano and +Shawn Cornally would totally agree. I think. :)

That said, here is my guidance for implementing standards-based grading in your classroom.




Other Resources:

Standards-based Grading FAQ sheet (for students and parents)
Standards-based Grading Digest (weekly links, blogs, videos, articles)
PPT "syllabus" explaining my standards-based scoring
#sbgchat on Twitter (discussion, support, links)

21 October 2013

Socrative 2.0 - "Clicker" Platform Gets Better

I've been a huge fan of iOS/Android/web app Socrative to replace the clickers in my classroom since the first week I had iPads in my classroom (January 2011).

I've used it for exit tickets, bell-work, chapter reviews, guided practice during a lecture, reflection on learning, and even a semester final. My students have ALWAYS complained that if they made a mistake or wanted to skip around, there was no "go back" functionality within Socrative. As flashy and fun as the app is, I've always leaned heavier on Google Forms to give my students that flexibility.

I gave a presentation last week on replacing the Scantron with digital tools that give you richer, quicker feedback and allow you to assess more than multiple choice. I love Socrative as a tool to close what I called the "feedback gap," that time between when you give an assessment, grade it, and then pass it back. What I didn't know was that Socrative had a new version in beta that addressed the "go back" functionality I mentioned above! You and your students can access the beta version of Socrative 2.0 at beta.socrative.com.

The UI is fresh, but there's more here than that!
Rather than write anything more, I decided to run through Socrative 2.0 briefly and share some of the new teacher and student features.

This video screencasts:
Administering a saved quiz - choosing quiz, setting student paced
Student taking quiz, going back through work and editing responses (Socrative 1.0 did not allow editing or skipping around)
Student submitting response
Viewing specific student responses and results (Socrative 1.0 only showed a student's score)
Finishing the saved quiz and getting a report


I last wrote about Socrative soon after the "insert image" into a question function went public, and that was much needed, but I appreciate the new student pacing options in Socrative 2.0 even more.

07 October 2013

Writing in Math: Modeling is Powerful


My students encounter writing most in my AP Statistics class. Because of the responsibility to my students to prepare them for an AP exam that will require them to justify the statistical tests they conduct, the conclusions they make, and the observations they draw from graphs, data sets, or computer outputs, I have no choice but to engage them in writing tasks.

Students will not learn to write technically on their own. It's unnatural. It's "hard."
For the majority of my students, AP Statistics is their first exposure to sustained, technical, and descriptive writing. For the most part, they've had short answer responses on some quizzes in other courses that required them to explain "why" they think their answer is reasonable, or how they came to their answer, but I find myself stretching and pulling all of 1st quarter to get these kids to write more than a sentence per prompt question.

Consider the following histogram of a roughly symmetric, normal-ish distribution (if I'm losing you there, just know that this bar graph should be symmetric with a high peak in the center and long tails to both positive and negative infinity):

A pretty common prompt in the section where normal curves are introduced would ask something like, "Describe the shape of the distribution." Students usually feel pretty good about themselves if they remember to point out that it's symmetric and has a single peak. If they're getting frisky, they'll mention the tails off to the left and right, and point out that because the distribution is not skewed (with the data clumped to the right or left with a long tail to one side), we know the mean and median would be roughly equal, in the middle of that peak.

I've come to expect these habits early in the year, so I lean heavy on giving positive feedback for effort (they wrote something), and always give a more specific example of how I would have refined what they said, or I read from my solution manual (sometimes even correcting the manual if I think the manual could have been more specific).

Students cannot know good technical writing without reading technical writing. 
Ironically, I think your textbook is a good place to start, because every section has passages you can easily pull that attempt to succinctly explain vocabulary or walk students through a procedure.

You hope there comes a time in every student's educational career where they stop filling their writing with flowery words that don't mean anything; that they would get to the point, especially in technical writing. However, the pendulum soon swings the other way and students write much less than they should, forcing the reader to assume much of the knowledge the student should be demonstrating.

To refer to the histogram again, I think a student given a prompt of, "Describe the histogram," or "Describe the characteristics of this data set" who had not been introduced to statistical terminology would probably write too much, and still not relay the point about symmetry and the placement of the mean/median. "There are 12 bars on the graph. The first one starts at -3 and goes up a little bit past 0.0..."

I've written all of this so I could share how embarrassingly inspired I was reading this report of a recent Pew Research Survey on the Affordable Care Act, and the role of Republicans vs. the President in the shutdown.

The writer of the report spent several paragraphs describing their methods, the lengths the survey designers took to eliminate response bias from the participants, and examining the bias they were unable to eliminate through their methods. It's just a lot of good, descriptive writing that will be great for the Experiment Design chapter in the AP Stats curriculum. 
"The analysis in this report is based on telephone interviews conducted October 3-6, 2013, among a national sample of 1,000 adults 18 years of age or older living in the continental United States (500 respondents were interviewed on a landline telephone, and 500 were interviewed on a cell phone, including 250 who had no landline telephone). The survey was conducted by interviewers at Princeton Data Source under the direction of Princeton Survey Research Associates International. A combination of landline and cell phone random digit dial samples were used; both samples were provided by Survey Sampling International. Interviews were conducted in English. Respondents in the landline sample were selected by randomly asking for the youngest adult male or female who is now at home. Interviews in the cell sample were conducted with the person who answered the phone, if that person was an adult 18 years of age or older. For detailed information about our survey methodology, see: http://people-press.org/methodology/. 
The combined landline and cell phone sample are weighted using an iterative technique that matches gender, age, education, race, Hispanic origin and region to parameters from the 2011 Census Bureau’s American Community Survey and population density to parameters from the Decennial Census. The sample also is weighted to match current patterns of telephone status, based on extrapolations from the 2012 National Health Interview Survey. The weighting procedure also accounts for the fact that respondents with both landline and cell phones have a greater probability of being included in the combined sample and adjusts for household size among respondents with a landline phone. Sampling errors and statistical tests of significance take into account the effect of weighting. The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey: 
 
Sample sizes and sampling errors for other subgroups are available upon request.
In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.
"

06 September 2013

The Next Step: Re-wrapping Within Your Instructional Context

We had our fall open house last night, and during the "hour" my Algebra 1 parents were in the room, I was sharing with them my philosophy on graphing. Basically, that wherever, whenever possible, I want kids to use technology to make their graphs so that they can do all the state-standards-type activities: analyze slopes,  make predictions, relate what's going on in an equation and/or table to the behavior of the function.

I set a rule for myself last winter. "Never graph linear equations by hand." This was an easy line to draw in the curricular sand of Algebra 2 because the kids (theoretically) already had a strong background in graphing linear equations, understanding that the coefficient of the variable was the slope, and how to articulate that between coordinate points.

Can I just as easily draw that line in Algebra 1? Do I believe as firmly in the value of conceptual understanding over procedure, or will I make the (perhaps) easier decision, teach a step-by-step process, and pass the conceptual buck on to the next teacher? (By the way, does anyone else feel like when you make an instructional concession against your better judgment for a student, that kid quite frequently ends up in your class the next year?)

So there I was, about to make a grand statement to parents about how their kids were never going to make graphs by hand, and I had to stop short. I literally stopped the sentence.
"I don't know," I said. "Its been a few years since I taught Algebra 1, so there are some things I'm trying to remember..."
"But, the content is easy, right?" a parent asked off to the side.
"Well, yes, but I have to conceptualize it differently between Algebra 2 and Algebra 1."
Is it wrong for there to be a difference, or must we always view our instruction within the context of our course/students' prior knowledge/school environment? Understanding when to re-contextualize that information and tailor it to those students in that year is what I've been learning about math instruction the last year. Lots of people can pick up math content. If content knowledge were most important, wouldn't a student ideally be able to teach an Algebra 1 course after they had mastered it?

The first time you teach a new course (or return to one you haven't had for a few years), there's a temptation to expect that once you reacquaint yourself with the content, the rest of the year will work itself out. Of course a master teacher is a content expert, but even more importantly, they are adaptable and flexible enough to make instructional design decisions that they're willing to abandon if the delivery is not appropriate for their students' context.


30 August 2013

Doing Statistics on Scientific Calculators, Grades 6-12

I'd be willing to bet that every approved syllabus on the College Board website for AP Statistics says that a graphing calculator (probably a TI, to be more specific) is required for success in the course. And while that probably IS true for AP Stats, that doesn't mean that all of the other kids doing statistics in your building need one!



Arguments Against Performing Statistical Calculations on Scientific Calculators
Whether you've said these things or know someone who has, I think it's a prevalent attitude in schools, because I've seen enough math teachers who cringe at the experience they had in college with statistics.

  • "I barely ever get to stats in my curriculum, and when I do, my students just do mean/median/mode. It's a lot easier to just have them calculate that by hand, than teaching them how to use their individual model of calculator."
  • "There's more value in having students perform these by hand so they can practice perseverance and have an understanding where the numbers come from."
  • "If kids want to study statistics, they can do it in high school. We just do means and averages in my class."

Once Again, Common Core Changes Everything
As early as 6th grade, students are expected to use descriptive measures like mean, median, and standard deviation to make decisions. Trust me when I say I don't really rely on a middle school student's ability to consistently compute a variance or standard deviation for a data set using the formula.
You can make the process look simpler, but then you have a big chart on your paper, which also stresses kids out.
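For the curious, here's roughly what that "big chart" boils down to, sketched with a made-up data set:

```python
# The "big chart" by hand, with a made-up data set:
# deviations from the mean, squared deviations, then average and take the square root.
data = [4, 7, 7, 9, 13]
mean = sum(data) / len(data)                         # 8.0
deviations = [x - mean for x in data]                # -4, -1, -1, 1, 5
squared = [d ** 2 for d in deviations]               # 16, 1, 1, 1, 25
variance = sum(squared) / len(data)                  # 8.8 (population variance)
std_dev = variance ** 0.5                            # about 2.97

print(mean, variance, round(std_dev, 2))
```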

Here are a few of the standards from 6th to high school: (from corestandards.org)
  • CCSS.Math.Content.6.SP.B.5 Summarize numerical data sets in relation to their context, such as by:
    • CCSS.Math.Content.6.SP.B.5a Reporting the number of observations.
    • CCSS.Math.Content.6.SP.B.5b Describing the nature of the attribute under investigation, including how it was measured and its units of measurement.
    • CCSS.Math.Content.6.SP.B.5c Giving quantitative measures of center (median and/or mean) and variability (interquartile range and/or mean absolute deviation), as well as describing any overall pattern and any striking deviations from the overall pattern with reference to the context in which the data were gathered.
    • CCSS.Math.Content.6.SP.B.5d Relating the choice of measures of center and variability to the shape of the data distribution and the context in which the data were gathered
  • CCSS.Math.Content.7.SP.A.2 Use data from a random sample to draw inferences about a population with an unknown characteristic of interest. Generate multiple samples (or simulated samples) of the same size to gauge the variation in estimates or predictions. For example, estimate the mean word length in a book by randomly sampling words from the book; predict the winner of a school election based on randomly sampled survey data. Gauge how far off the estimate or prediction might be.
  • CCSS.Math.Content.HSS-IC.B.3 Recognize the purposes of and differences among sample surveys, experiments, and observational studies; explain how randomization relates to each.
  • CCSS.Math.Content.HSS-IC.B.4 Use data from a sample survey to estimate a population mean or proportion; develop a margin of error through the use of simulation models for random sampling.
  • CCSS.Math.Content.HSS-IC.B.5 Use data from a randomized experiment to compare two treatments; use simulations to decide if differences between parameters are significant.
  • CCSS.Math.Content.HSS-IC.B.6 Evaluate reports based on data.
  • CCSS.Math.Content.HSS-ID.A.4 Use the mean and standard deviation of a data set to fit it to a normal distribution and to estimate population percentages. Recognize that there are data sets for which such a procedure is not appropriate. Use calculators, spreadsheets, and tables to estimate areas under the normal curve.
How Are My Kids Going To Do All of This??
The good news: A built-in function to most every scientific calculator is the ability to enter a simple list and run a 1-variable stats analysis to get (at the very least) the distribution's count, mean, variance, and standard deviation.
More good news: According to Smarter Balanced's testing manual, students taking the high school tests will have access to statistical calculators on the test.
The bad news: The keystrokes are a bit different on every model, so teaching your students to use their scientific calculators to get measures of center or spread from a dataset will have to be more about principles of the process (entering the data into a list, finding the button/menu that has your mean/median/standard deviation in it), than it will be about walking through specific keystrokes with the whole class.
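If you want to see (or show) what a 1-variable stats run is doing behind the keystrokes, here's a rough equivalent using Python's statistics module with a made-up list; most calculators report both the sample (Sx) and population (σx) standard deviations.

```python
import statistics

# A made-up data set, standing in for the list a student would enter.
data = [4, 7, 7, 9, 13, 15, 15, 18]

print("n      =", len(data))
print("mean   =", statistics.mean(data))
print("median =", statistics.median(data))
print("Sx     =", round(statistics.stdev(data), 3))   # sample standard deviation
print("sigma  =", round(statistics.pstdev(data), 3))  # population standard deviation
```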

Tutorials To Share
You don't have to be an expert. Watch these yourself, then share them with your students on Edmodo or your class webpage so students can review the video relevant to their needs.

TI-30XS (MultiView)


TI-30XA


TI-30X IIS




Casio fx-991ES


Casio fx-85ES


Casio fx-83MS


These are the calculators I see MOST frequently in my lower level math classes. If you or your students have a different model, a simple Google or YouTube search for "statistics on [your model here]" should at least get you started.




27 August 2013

Emergency Lesson Plans: AP Statistics

Someone came to the blog via this search a few days ago and found this post about my son's hospitalization as the catalyst for my decision to finally MAKE emergency sub plans, but I'm sure they left disappointed, because I didn't actually share my plans in that post. :)



Principles for Emergency Planning
In the previous post about emergency sub plans, I laid out these principles I was going to follow as I crafted my plans:
  1. When choosing learning objectives, I'm going to focus on things we are currently addressing in our PLC Smart Goal, or critical skills that consistently need reinforcing. 
  2. Have a backup to the emergency plans in case students finish quickly.
  3. Crowdsource your emergency plans with co-teachers.
  4. To give yourself ultimate flexibility (particularly in your emergency back-up), keep a class set of your course textbook (or at least enough copies for 2 kids to share) in a closet or cabinet nearby.
  5. Make it something you'll at least consider grading when you return.
Emergency Planning for AP Statistics
I have three thematic goals for my AP Stats students throughout the year: data gathering, data crunching, and technical writing, so if I stick to those goals (principle #1), all of the options listed here will be mostly applicable no matter where we are in the curriculum.
  • AP Practice Exam: The College Board releases items from old exams on their website. Might as well take advantage.
    • 1997 Released Practice: Having students complete this all in one hour would be an impossibility, so you've got at LEAST two days of material here. I would have students work in pairs on either the multiple choice or free response and collaborate quietly.
    • Free AP Stats Practice Exam This test is secured behind your College Board Course Audit login, so be sure to have that info handy.
  • Data Collection/Experiments
    • "How Long Is a Minute" Materials and resources are usually intentionally scarce on guest teacher days, so its impractical to have plans that require elaborate handouts or materials list. This experiment is my current emergency plan, and I love it because its easily approachable early in the year before we've done a TON of stats work, and only requires a clock in the classroom with seconds.
    • Probability Simulator TI 83/84 app + this handout The handout is actually for a more extensive project, so you'd have your students BEGIN the project on this day (and work on it periodically over a matter of days), or just have students complete a portion of the work.
  • Probability Based Gaming


26 August 2013

Data Doesn't Have to Be Scary - How You Get it Matters

Do you ever feel like you're left with a pile like this after you've collected data for a survey or given a common assessment that needs to be graded and entered into a spreadsheet?
Yeah, I Know Someone Like That...
Gathering data to use for your building goals, data teams, or class statistics projects doesn't have to end with you and/or your colleagues spending hours in front of a spreadsheet entering values (and inevitably making errors). This TED Talk from global health consultant and pediatrician Joel Selanikio (@jselanikio) chronicles his experience with paper surveys in the developing world. What Dr. Selanikio realized from his field observations was that no matter how many nurses tramped through jungles door-to-door surveying families on child births, immunizations, deaths, and the like, dealing with the data was a cumbersome, daunting task that usually was abandoned. The result was "data-driven" decision making on vaccine supply that was formed from very small, very incomplete datasets.




What's This Mean for Me as a Teacher?
Dr. Selanikio's story is about the success of cloud-based, user-friendly, digital data collection methods, and I think we can experience the same in our schools. Beyond the results your state sends weeks (months?) after your spring standardized testing, how much of the data you gather about the student learning at your school is ever compiled electronically?

I think a lot of the one-more-thing mentality that many teachers associate with data teams in their schools is the result of their experiences grading everything by hand, tallying the scores on paper or in their grading software, and THEN taking the scores and (hopefully) copying and pasting them into some common spreadsheet. So how can you experience an explosion of efficiency like the one observed in the TED Talk?

If it cannot be gathered electronically, don't collect the data.

The Shift
Only collecting data electronically MAY mean that you and your team have to rewrite some assessments, but that might be for the best anyway. Which is more valuable: asking a student to graph the equation of a line in slope-intercept form (which would be hard to enter on a spreadsheet), or having the student graph the equation of a line and then tell you something about its slope, intercept, or coordinates? Once students are taking what they do graphically and making textual conclusions or inferences, those responses can easily be matched and manipulated in Excel.
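As a rough sketch of that "matched and manipulated" step, once the responses are exported to a CSV you can score every item in a few lines (the file name, column names, and answer key below are hypothetical):

```python
import pandas as pd

# Hypothetical export of Google Form responses, one column per question.
responses = pd.read_csv("form_responses.csv")
key = {"Q1_slope": "2", "Q2_y_intercept": "-3", "Q3_x_intercept": "1.5"}

# Percent of students answering each item correctly.
for question, answer in key.items():
    correct = (responses[question].astype(str).str.strip() == answer).mean()
    print(f"{question}: {correct:.0%} correct")
```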


What Ways Can I Gather Data Electronically in My Classroom?
The secret is in taking advantage of mobile devices that you and your students already have.
Try any of these on for size:


12 August 2013

Never Smile Until 2nd Quarter (and Other First Day Myths)

I think the "never smile" rule was one of the first things my cooperating teacher told me before our first day before student teaching. That and wearing ties. But who made that up? Why is it still perpetrated? Here's a list of reasons I think to save our smiling for the 2nd Qtr.

1ST DAY CHECKLIST
Alienate students.
Inspire dread at the thought of coming to my class.
Ensure students don't think I care.
Build a connection between cold personal interactions and negative atmosphere toward learning my content.
Quench any fun.


I don't know about you, but I have a better time teaching the more I smile through the day. I cannot imagine getting through weeks of school forcing myself to keep a straight face in the midst of the silly things kids do and say. For another perspective on "never smile," here's a post middle school principal +Shawn Blankenship wrote a couple years ago. "Never Smile Until Christmas"


What about some others?

1. KIDS DON'T WANT TO DO ANYTHING THE FIRST DAY
Correction. They don't want to hear a lecture and produce a worksheet the first day. Of any days, the first day should perhaps be the MOST engaging. The first day of school is a time for first impressions for you just as much as for the students. Most of us would go out of our way to seem extraordinarily amazing and interesting on a first date or job interview - why don't we pull out all the stops for that first experience for our students? There'll be other days for book cards.

Today in AP Stats, I'll be taking advantage of the massive crowds of schedule-changers and ID photo takers to provide a captive audience for my students' first attempts at data collection, and Algebra 1 will be developing linear expressions for modeling toothpick figures (although they won't know that until day 2).

The best example I've seen that wasn't content related is my friend Beth. She brings a whole duffel bag of stuff in and shares the experience of each trinket with her students. It really sets the stage for establishing a safe environment for sharing in her classroom.

2. I'LL NEVER LEARN ALL MY STUDENTS NAMES
Are you a "hit the ground running" teacher? I'm sure you probably cover more content than me, but at what expense relationally? Spend time in activities the first day that get you and your students talking to each other and saying each other's name. Keep up the deliberate name-learning activities for several days.

3. NEVER GIVE HOMEWORK/ ALWAYS GIVE HOMEWORK THE FIRST DAY
I think this one is hyper-contextually subjective. If you're teaching an honors, AP, or upper-level course, or intend to be super-dedicated and habitual about when students can expect homework, I think it would be completely appropriate. If you're doing it to seem tough and shock some kids into changing their schedule, my opinion is that kids usually see through the ruse.

4. YOU MUST DISCUSS EXPECTATIONS AND PROCEDURES AS A CLASS
Definitely. Early and often. But going back to #1, find a way to make a game of it; get kids moving around.

Are there any others you've heard? Any you disagree with?

11 August 2013

Teaching Argument Writing...for Preschoolers! (and Anyone Else)


I had the pleasure of attending a two day training over the summer with the Gateway Writing Project about teaching argument writing and how we can use it to support the need for evidence-based writing in the Common Core ELA and math standards. 


  • CCSS.ELA-Literacy.W.9-10.1 Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence.
  • CCSS.ELA-Literacy.W.6.1 Write arguments to support claims with clear reasons and relevant evidence.
  • CCSS.Math.Practice.MP3 Construct viable arguments and critique the reasoning of others.





Although the requirements for thinking abstractly and putting together objective, argument pieces are not introduced until grade 6, the foundation of vocabulary, thinking and process of gathering evidence, making claims, and evaluating warrants could ideally begin in earlier grades as students write explanatory pieces.

Before I get ahead of myself, let me quickly define for you a warrant and a claim, drawing from this book by George Hillocks, Jr., Teaching Argument Writing, Grades 6-12.


"Warrants may be common sense rules that people generally accept as true, laws, scientific principles or studies, and thoughtfully argued definitions." (Hillocks, pg xxiii)
Claims are the statement or value you are trying to prove the evidence supports. (Hillocks, pg xix)












But What Does That Have To Do With Teaching Preschoolers?

I had the pleasure of spending so much time watching my children (3 years, 8 months, and 2 years old) learn and play this summer, so most of what I'm processing as a teacher right now is through my lens as their teacher this summer. Also, if you can communicate an idea to a 3 year old, you know you're set for the intended audience. :)

Of course, I wasn't sitting down with my daughter this evening and discussing vocabulary with her - we weren't even writing anything. Our exposure to argument and reasoning was during story time before bed.


Breaking It Down

We read Pinkalicious: The Princess of Pink Slumber Party, which recently came from the library.








The plot of this story doesn't really matter. You just need to know that Pinkalicious has a slumber party, and one of the friends ends up having a fear of falling asleep at another house.

Pinkalicious gets the girl to imagine various sounds and smells around the house are from a happy guardian dragon.










We met the dragon and I thought, "Interesting time to test some reasoning," so, very naturally, in my best inquisitive voice, I asked Lucy, 
"Do you think this is a mean dragon, or a nice dragon?"
"It's nice."
"How do you know?"
"Because its smiling!"  ::she points to the dragons mouth::
"Ah, and so usually when people are nice, they smile."
We continue on the story, and I'm happy to report, the little girl has no problem falling asleep.


Connecting to Terminology

Lucy did not produce an entire argument on her own, of course, but would you even expect that of all your middle schoolers or 9th graders? With some scaffolding questions, she was able to show the bones of some basic reasoning, however.

Claim: In response to my leading question, Lucy's claim is that this is a nice dragon.
Evidence: When I asked Lucy how she knew, she easily pointed out that the dragon was smiling.

In the experience of my own classroom with something like classifying functions, the process has been the same. I might ask, "Is this a quadratic function," to which a student at first may only respond, "Yes," but after I ask, "How do you know," she will often be able to point out a defining characteristic from a table, graph, or equation.
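For the table case, one concrete defining characteristic is constant second differences over equally spaced x-values; here's a quick sketch with made-up values:

```python
# One "defining characteristic from a table": for equally spaced x-values,
# a quadratic has constant second differences. The sample y-values are made up.
ys = [3, 6, 11, 18, 27]                              # y-values for x = 0, 1, 2, 3, 4

first = [b - a for a, b in zip(ys, ys[1:])]          # [3, 5, 7, 9]
second = [b - a for a, b in zip(first, first[1:])]   # [2, 2, 2]

print("constant second differences -> quadratic:", len(set(second)) == 1)
```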

Warrant: As defined by Hillocks, a warrant is often something generally known. Kids learn quickly that they can usually trust adults or other kids that are smiling. Lucy left this off of her "argument," but warrants are the bones that support claims.

While warrants in an English or Social Studies class may be a little more subjective, I think STEM subjects generally have a stronger leg to stand on when picking out and using warrants. I explained warrants to a colleague today in the math office as being the properties and laws our students (often) write down and (rarely ever) use in their problem solving.


In the Classroom

The first several times you attempt argument writing with your students, I think it may end up looking and feeling a lot like my conversation with Lucy. I think that has to be okay.  Use guiding questions in pre-reading. Support them with definitions and clarify/refine their usage of warrant/ claim/ counterclaim. Restate their conclusion so they can have a chance to analyze if it "sounds" right once they've heard it outside of their own head or from their paper.

09 August 2013

Tech Integration Tip: Try New Things With One Class At A Time

You've got a new piece of hardware, app, or website you want to try out in your classroom. You've thought through some of your students' potential difficulties, what could potentially go wrong, and what you can do to mitigate those problems, but there is still an element of the unknown until you or a colleague try it out in the context of your own district or school.

"Pilot!" you think, excited to see what may come of your idea. But who pilots for the the early adopters in your building? What about the late adopters who are slow to adopt a tool until they've seen it work in their classroom?


Where Do I Start?
Pilot for yourself! I got the idea of using my own classroom as both experimental and control group from a university study on clickers I found while working on an action research plan in graduate school. The professors wanted to incorporate and investigate the use of a student response system (clickers) for generating feedback for their students, but had trouble getting colleagues to participate in their study.

To get as large of a test/control group population as they could, the professors each had half of their students acting as control and half acting as experiment groups for a portion of the semester. Having students in the same section serve as both control and experimental groups in the course of the study mitigated variables of individual student achievement, hour of the day, day of the week, and individual teaching style. It boiled the clicker study down to this question:
Is there a significant difference in these students' learning when clickers are used to generate immediate feedback, or not?
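If you want to put a number on that question, one common approach is a two-sample t-test comparing the two halves; here's a minimal sketch with made-up scores (not the study's actual data):

```python
from scipy import stats

# Hypothetical unit-test scores from the same class period,
# half using clickers for feedback and half not.
clicker_half = [78, 85, 92, 74, 88, 81, 90, 79, 84, 87]
control_half = [72, 80, 85, 70, 78, 83, 76, 79, 74, 81]

t_stat, p_value = stats.ttest_ind(clicker_half, control_half)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p-value suggests a real difference
```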
What's that mean for my team and my classroom?
I'm sure my data team has not been alone in wondering, when we collect data on common assessments, whether the differences in student scores were the result of any of several factors, including but not limited to teaching style, hour of the day, and the students in the class. These are all important factors to consider when comparing two sections of the same course.

Establishing (and rotating) experimental and control classes within your day eliminates all of the uncertainty of what those other factors may be contributing and narrows your focus to (as close as possible) ONE single variable.

Can it be replicated?
Once ONE of the teachers in your subject area or data team has had positive results using a new technology or strategy, others can have confidence in trying it out for themselves, in their own context.

04 June 2013

ClassBadges.com and Standards-Based Grading

The badge art I uploaded.
 Kids LOVE competing, right?

What I appreciate about the grading reform away from A-F that we've seen the last several years is that there is less comparison between letter grades amongst students and more emphasis on "what do you know?" More students have a chance to be the "smart" ones because their failure on Standard X does not always mean they will fail on Standard Y, which traditional letter grades can suggest and sometimes lead to.

However, standards-based grading can get a little black-and-white, cut-and-dried sometimes, and kids can get overwhelmed by charts, so why not gamify your students' standards growth and get your kids to compete with each other rather than against each other?

This summer I was "strongly encouraged" to use Accelerated Math, a system we use mainly for intervention during the school year. AM is an adaptive, differentiated learning system that links in with the STAR assessments, which everyone in the district uses for benchmarking. I'd been previously trained on AM but never actually used it myself, and one thing I'd remembered from my colleagues' feedback was that AM is a great program for kids who can set goals, pace their work, and monitor their own on-task behavior. As I mentioned in my last post about summer school, these are not usually qualities I see in my students, so to mitigate the summer being a total disaster for 3/4 of my class, I decided to use classbadges.com to track their objective mastery (and other, more PBIS-type accomplishments as well).

Above you'll see all of the badges I created to award to the students. Technically our program this summer is "credit recovery," so they need only to get 60% or higher (the black token) on the objectives I assign in Accelerated Math, but I knew some kids would work harder to get better badges (just like someone might spend hours getting a certain achievement in a video game), so I made the grey "master" badge for 80%, and the gold "expert" badge for 95% scores on an objective.

I'm still going to work on more traditional goal-setting with my students, and today's early results with the Pomodoro Technique were positive, but these badges were a fun way to give a few kids an extra, whimsical incentive.

31 May 2013

5 Reasons Why Teaching Summer School is Good for Professional Development

I'm beginning my 6th edition of summer school for my district on Monday. My first summer, before we had children and my wife was still working, summer school was a means to cut my teeth in my own real-life math classroom before the fall semester began and stuff got real. 

Teaching summer school is more of a financial requirement for our family now, but there are still several things I enjoy about the summer session that I think make me a better teacher.

1. "Do your worst."
If you can handle what summer school students have to throw at you, you'll probably be prepared for the worst you may see during the regular session. Some summer school students are highly motivated (which is what I expected of all of them before my first summer), and they are a delight, but most in my district come to me with one or several of the following: immaturity, frustration, anger, dejection, ambivalence, complacency. I jumped over a table last summer in the midst of what looked to be a fight brewing. It was awesome.

2. Try new things.
The thinking on this is that for most of these kids, whatever traditional activities you or a colleague tried during the spring or fall were not successful strategies, so repackaging the regular curriculum into a shorter chunk is asking for boredom at best, and more failure at worst. I feel less pressure to have my lessons or activities go perfectly during the summer because classes are smaller and we meet longer, so it's more feasible to clarify directions and completely change course if necessary without sacrificing an entire day's 50 minute period.

I love piloting projects, games, and software in the summer.

3. Motivate, motivate, motivate.
This is for my students, too, but secretly maybe for me the most. :)
There aren't a ton of self-starting, goal-setting 16 and 17 year olds to begin with - you're definitely not going to find them in summer school. Teaching my summer students forces me to rethink why I teach, and what heights are possible for any of these students. 

Morning-grump-Chuck does not fly in summer session.

4. Make new connections with colleagues
Since my district consolidated summer school for our 3 high schools into one building 3 years ago, summer school has meant working more closely with teachers I only see on sporadic district PD days. Collaborating with these teachers during the summer has given me a clearer picture of what goes on across my district and helps me make sense of goals and vision given to us from administration across the street. Most of the trainings on this page were made possible by connections I made during summer school.

5. Prioritize and dump.
You know that practice where you teach some things during the regular school year "for exposure," just because it's in your book, or you like that trick? 

Reteaching an entire semester's worth of content is obviously impossible, so the summer session requires refinement of the curriculum to essential topics and strategies. Fortunately for me, my district already has separate pacing guides for the summer, so I don't have to reinvent the wheel every summer.

This year, to cut costs, our district is holding only "credit recovery" courses, which the state allows for students who were close to passing. My class will be 2 weeks instead of the usual 4, so I'm forced to prioritize and predict where the biggest needs will be amongst my recovering students.

Why does this matter for the regular session? The topics we end up covering in summer school often become the subject of smart goals and data collection from common formative assessments in our PLCs.