References with Module Alignment

Ainsworth, L. (2015). Common formative assessments: How teacher teams intentionally align standards, instruction, and assessments. Thousand Oaks, CA: Corwin.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment

American Association of School Administrators. (2002). Using data to improve schools: What's working? Arlington, VA: Author.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment

Ames, C., & Archer, J. (1988). Achievement goals in the classroom: Students' learning strategies and motivation processes. Journal of Educational Psychology, 80(3), 260-267.

  • Module(s): 6, 7
  • Relevance: Goal setting benefits

Anderson, N., Brockel, M., & Kana, T. E. (2014). Disciplined inquiry: Using the A+ inquiry framework as a tool for eliminating data hoarding, mindless decision-making, and other barriers to effective ESA programming. Perspectives: A Journal of Research and Opinion About Educational Service Agencies, 20(3).

  • Module(s): 1-13
  • Relevance: A+ Inquiry framework

Bernhardt, V. (2013). Data, data everywhere: Bringing all the data together for continuous improvement. New York, NY: Routledge.

  • Module(s): 1-13
  • Relevance: Data types - student learning, demographic, school process, perception

Burnaford, G. (2012). Assessment, evaluation, research: What’s the difference? Why should we care? Chicago, IL: Chicago Arts Partnerships in Education. Retrieved from http://www.artsassessment.org/wp-content/uploads/2012/02/Assessment.Eval_.Research.Burnaford.pdf

  • Module(s): 1
  • Relevance: Assessment, research, and evaluation definitions

Chappuis, J., & Stiggins, R. J. (2017). An introduction to student-involved assessment for learning (7th ed.). New York, NY: Pearson.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment, assessment for learning, assessment of learning

Council of Chief State School Officers. (2013, April). Interstate Teacher Assessment and Support Consortium (InTASC) model core teaching standards and learning progressions for teachers 1.0: A resource for ongoing teacher development. Washington, DC: Author.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed-methods approaches (4th ed.). Thousand Oaks, CA: Sage.

  • Module(s): 1
  • Relevance: Research examples

Earl, L. M. (2013). Assessment as learning: Using classroom assessment to maximize learning (2nd ed.). Thousand Oaks, CA: Corwin.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment, assessment for learning, assessment of learning

Eggen, P., & Kauchak, D. (2012). Strategies and models for teachers: Teaching content and thinking skills (6th ed.). Boston, MA: Pearson.

  • Module(s): 2
  • Relevance: Critical thinking

Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. L. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Upper Saddle River, NJ: Pearson.

  • Module(s): 1, 3, 4
  • Relevance: Formative evaluation, summative evaluation, evaluation examples

Geier, R., & Smith, S. (2012). District and school team data toolkit. Everett, WA: Washington Office of Superintendent of Public Instruction, Washington School Information Processing Cooperative, and Public Consulting Group.

  • Module(s): 1, 5
  • Relevance: Formative assessment, summative assessment, criterion referenced assessment, norm referenced assessment

Glesne, C. (2010). Becoming qualitative researchers: An introduction (4th ed.). Boston, MA: Pearson.

  • Module(s): 1
  • Relevance: Research definition

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications_reviews.aspx

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment

Hasbrouck, J., & Ihnot, C. (2007). Curriculum-based measurement: From skeptic to advocate. Perspectives on Language and Literacy, 33(2). Retrieved from http://www.rtinetwork.org/essential/assessment/progress/cbm-advocate

  • Module(s): 8
  • Relevance: Practice effect

Hausknecht, J. P., Halpert, J. A., DiPaolo, N. T., & Moriarty Gerrard, M. O. (2007). Retesting in selection: A meta-analysis of coaching and practice effects for tests of cognitive ability. Journal of Applied Psychology, 92(2), 373-385.

  • Module(s): 8
  • Relevance: Practice effect

Intervention Central. (n.d.). Listening passage preview. Retrieved from http://www.interventioncentral.org/academic-interventions/reading-fluency/listening-passage-preview

  • Module(s): 8
  • Relevance: Listening passage preview

Jenkins, J., Hudson, R., & Lee, S. (2007). Using CBM-reading assessments to monitor progress. Perspectives on Language and Literacy, 33(2). Retrieved from http://www.rtinetwork.org/essential/assessment/progress/usingcbm

  • Module(s): 8
  • Relevance: Curriculum Based Measurement protocol

MetaMetrics. (2008). Lexile measures in the classroom. Durham, NC: Author.

  • Module(s): 9
  • Relevance: Lexile Measures

National Center on Response to Intervention. (n.d.). Using progress monitoring data for decision making [webinar transcript]. 

  • Module(s): 8
  • Relevance: Progress monitoring baseline calculation

National Center on Response to Intervention. (2010). Essential components of RTI: A closer look at response to intervention. Washington, DC: U.S. Department of Education, Office of Special Education Programs, National Center on Response to Intervention.

  • Module(s): 5
  • Relevance: Response to Intervention (RTI) definition, purpose of universal screening

National Center on Response to Intervention. (2012, June). RTI implementer series: Module 1: Screening - Training manual. Washington, DC: U.S. Department of Education, Office of Special Education Programs, National Center on Response to Intervention.

  • Module(s): 1
  • Relevance: Universal screening methods and vocabulary, cut score definition, reliability, validity

National Center on Response to Intervention. (2012, July). RTI implementer series: Module 2: Progress monitoring - Training manual. Washington, DC: U.S. Department of Education, Office of Special Education Programs, National Center on Response to Intervention.

  • Module(s): 1, 8
  • Relevance: Progress monitoring methods and vocabulary, baseline calculation

National Forum on Education Statistics. (2016). Forum guide to education data privacy (NFES 2016-096). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

  • Module(s): 8
  • Relevance: Data privacy

North Dakota's Multi-Tier System of Support. (2016, April). NDMTSS essential components summary. Retrieved from http://www.ndmtss.org/wp-content/uploads/2016/01/EssentialComponentsSummaryFINAL-2.pdf

  • Module(s): 5
  • Relevance: NDMTSS essential components

Northwest Evaluation Association. (2014, September). MAP reports reference for the Web-based MAP system. Portland, OR: Author.

  • Module(s): 5-7, 9-13
  • Relevance: Mockup report examples

PBISApps. (2016, May). SWIS 5.6 office referral form definitions. Available from https://www.pbisapps.org/Resources/Pages/SWIS-Publications.aspx

  • Module(s): 1
  • Relevance: Behavior data examples

Perie, M., Marion, S., & Gong, B. (2009). Moving toward a comprehensive assessment system: A framework for considering interim assessments. Educational Measurement: Issues and Practice, 28(3), 5-13.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, interim assessment, summative assessment

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.

  • Module(s): 1, 3, 4
  • Relevance: Formative evaluation (e.g., needs assessment, process evaluation), summative evaluation

Searle, M. (2010). What every school leader needs to know about RTI. Alexandria, VA: ASCD.

  • Module(s): 5
  • Relevance: Response to Intervention (RTI) tier percentages

Senko, C. (2016). Achievement goal theory: A story of early promises, eventual discords, and future possibilities. In K. R. Wentzel & D. B. Miele (Eds.), Handbook of motivation at school (2nd ed., pp. 75-94). New York, NY: Routledge.

  • Module(s): 6, 7
  • Relevance: Goal setting benefits

Statewide Longitudinal Data Systems Grant Program. (2015). SLDS data use standards: Knowledge, skills, and professional behaviors for effective data use, version 2. Washington, DC: U.S. Department of Education, National Center for Education Statistics.

  • Module(s): 1-13
  • Relevance: Statewide Longitudinal Data System (SLDS) data use standards

Stenner, A. J., Burdick, H., Sanford, E. E., & Burdick, D. S. (2007, April). The Lexile framework for reading technical report. Durham, NC: MetaMetrics.

  • Module(s): 9
  • Relevance: Lexile Measures

Stronge, J. H., & Grant, L. W. (2009). Student achievement goal setting: Using data to improve teaching and learning. New York, NY: Routledge.

  • Module(s): 6, 7
  • Relevance: Goal setting purpose, benefits of goal setting, cautions of goal setting, SMART goals

Thum, Y., & Hauser, C. (2015). NWEA 2015 MAP norms for student and school achievement status and growth. NWEA Research Report. Portland, OR: NWEA.

  • Module(s): 5-7, 9-13
  • Relevance: Percentile categories

Tomlinson, C. A. (2000). Differentiation of instruction in the elementary grades. Retrieved from ERIC database. (ED443572)

  • Module(s): 9
  • Relevance: Differentiated instruction

Tomlinson, C. A., & Moon, T. R. (2013). Assessment and student success in a differentiated classroom. Alexandria, VA: ASCD.

  • Module(s): 1, 3, 4, 9
  • Relevance: Formative assessment, summative assessment, differentiated instruction

U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Use of education data at the local level from accountability to instructional improvement. Washington, DC: Author.

  • Module(s): 1, 3, 4
  • Relevance: Formative assessment, summative assessment

Usher, A., & Kober, N. (2012). Can goals motivate students? Washington, DC: The George Washington University, Graduate School of Education and Human Development, Center on Education Policy.

  • Module(s): 6
  • Relevance: Benefits of goal setting, cautions of goal setting, types of goals, characteristics of effective goals

Other Relevant References

Consulted During Curriculum Development

  • AdvancED. (2013). Guidelines for creating the stakeholder feedback data document. Retrieved from http://extranet.advanc-ed.org/assist_resources_and_tools/docs/stakehold…
  • AdvancED. (2013). Guidelines for creating the student performance data document. Retrieved from http://extranet.advanc-ed.org/assist_resources_and_tools/docs/assessmen…
  • AdvancED. (2015, June 25). AdvancED accreditation policies and procedures for AdvancED accreditation. Retrieved from http://www.advanc-ed.org/sites/default/files/documents/AdvancED-Policie…
  • AdvancED. (2016). AdvancED: North Dakota 2014-2015 overall survey averages. Author.
  • American Institutes for Research. (2016, November 2). North Dakota State Assessment online reporting system user guide: 2016-2017. Washington, DC: Author. Retrieved from http://ndsa.portal.airast.org/wp-content/uploads/ND_ORS_Product_Guide.p…
  • Bernhardt, V. (1998, March). Multiple measures (Invited monograph No. 4). California Association for Supervision and Curriculum Development (CASCD).
  • Bernhardt, V. (2013). Translating data into information to improve teaching and learning. New York, NY: Routledge.
  • Bohlin, L., Durwin, C. C., & Reese-Weber, M. (2012). EdPsych modules (2nd ed.). New York, NY: McGraw-Hill.
  • Booth, W., Colomb, G., & Williams, J. (2008). The craft of research (3rd ed.). Chicago, IL: The University of Chicago Press.
  • Borden, V. M. H. (2002). Information support for assessment. In T. W. Banta (Ed.), Building a scholarship of assessment (pp. 167-181). San Francisco, CA: Jossey-Bass.
  • Brown, J., & Skow, K. (2009). RTI: Progress monitoring. Retrieved from http://iris.peabody.vanderbilt.edu/wp-content/uploads/pdf_case_studies/ics_rtipm.pdf
  • Castellano, K., & Ho, A. (2013). A practitioner’s guide to growth models. Council of Chief State School Officers. Retrieved from https://scholar.harvard.edu/files/andrewho/files/a_pracitioners_guide_to_growth_models.pdf
  • Centers for Disease Control and Prevention. (n.d.). Evaluation types. Retrieved from http://www.cdc.gov/std/Program/pupestd/Types%20of%20Evaluation.pdf
  • Connecticut State Department of Education. (2014). Student learning goals/objectives: A handbook for administrators and teachers. Middletown, CT: Author.
  • Corbin, J., & Strauss, A. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory (4th ed.). Thousand Oaks, CA: Sage.
  • Deno, S. (n.d.). Ongoing student assessment. Retrieved from http://www.rtinetwork.org/essential/assessment/ongoingassessment
  • Dynamic Measurement Group. (2012, October 27). Progress monitoring with DIBELS Next. Retrieved from https://dibels.org/papers/ProgressMonitoringGuidelines.pdf
  • easyCBM. (2014, August 5). Interpreting easyCBM test results. Retrieved from https://app.easycbm.com/static/files/pdfs/info/ProgMonScoreInterpretati…
  • Engel, R. J., & Schutt, R. K. (2008). Single-subject design. In The practice of research in social work (2nd ed., pp. 206-246). Thousand Oaks, CA: Sage. Retrieved from http://www.sagepub.com/sites/default/files/upm-binaries/25657_Chapter7.pdf
  • Engel, R. J., & Schutt, R. K. (2017). The practice of research in social work. Thousand Oaks, CA: Sage.
  • Fletcher, J. M., & Vaughn, S. (2009). Response to intervention: Preventing and remediating academic difficulties. Child Development Perspectives, 3(1), 30-37.
  • Fuchs, L. S., & Fuchs, D. (2011). Using CBM for progress monitoring in reading. Retrieved from http://files.eric.ed.gov/fulltext/ED519252.pdf
  • Gregory, G. H., & Kuzmich, L. (2014). Data driven differentiation in the standards-based classroom (2nd ed.). Thousand Oaks, CA: Corwin.
  • Gerzon, N., & Guckenburg, S. (2015). Toolkit for a workshop on building a culture of data use (REL 2015–063). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands. Retrieved from http://ies.ed.gov/ncee/edlabs
  • Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Available at http://dibels.uoregon.edu/
  • Hausknecht, J. P., Halpert, J. A., DiPaolo, N. T., & Moriarty Gerrard, M. O. (2006). Retesting in selection: A meta-analysis of practice effects for tests of cognitive ability. Retrieved from http://digitalcommons.ilr.cornell.edu/articles/13/
  • Intervention Central. (n.d.). Curriculum based measurement warehouse: Reading, math, and other academic assessments. Retrieved from http://www.interventioncentral.org/curriculum-based-measurement-reading…
  • Kekahio, W., & Baker, M. (2013). Five steps for structuring data-informed conversations and action in education (REL 2013-001). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Pacific. Retrieved from https://files.eric.ed.gov/fulltext/ED544201.pdf
  • Kekahio, W., Cicchinelli, L., Lawton, B., & Brandon, P. R. (2014). Logic models: A tool for effective program planning, collaboration, and monitoring (REL 2014–025). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Pacific. Retrieved from http://ies.ed.gov/ncee/edlabs
  • Lawton, B., Brandon, P. R., Cicchinelli, L., & Kekahio, W. (2014). Logic models: A tool for designing and monitoring program evaluations (REL 2014–007). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Pacific. Retrieved from http://ies.ed.gov/ncee/edlabs
  • Lembke, E., & Stecker, P. (2007). Curriculum-based measurement in mathematics: An evidence-based formative assessment procedure. Portsmouth, NH: RMC Research Corporation, Center on Instruction.
  • Lichtman, M. (2010). Qualitative research in education: A user’s guide (2nd ed.). Thousand Oaks, CA: Sage.
  • Mandinach, E. B., & Gummer, E. S. (2016). Data literacy for educators: Making it count in teacher preparation and practice. New York, NY: Teachers College Press.
  • McCawley, P. F. (2009). Methods for conducting an educational needs assessment: Guidelines for cooperative extension system professionals (BUL 870). Moscow, ID: University of Idaho Extension.
  • Michigan Assessment Consortium. (2015). Assessment literacy standards: A national imperative. Lansing, MI: Author.
  • Mills, L. B. (2011). Creating a data driven culture: Leadership matters [White paper]. SAS Institute Inc.
  • Morse, J. M., & Field, P. A. (1995). Qualitative research methods for health professionals (2nd ed.). Thousand Oaks, CA: Sage.
  • National Center on Response to Intervention. (2013, January). Progress monitoring brief #1: Common progress monitoring omissions: Planning and practice. Washington, DC: U.S. Department of Education, Office of Special Education Programs, National Center on Response to Intervention.
  • Nebraska Department of Education, & North Central Comprehensive Center at McREL. (2012). Using data to guide action for school improvement. Denver, CO: McREL. Retrieved from http://govdocs.nebraska.gov/epubs/E2000/H115-2012.pdf
  • Northwest Evaluation Association. (2014, July). Assessment matters: A comprehensive look at the value of interim assessment. Portland, OR: Author.
  • Northwest Evaluation Association. (2015, September). The path to mastery -- for every student: How do mastery measures help educators support students in RTI programs. Portland, OR: Author.
  • Northwest Evaluation Association. (2015, August). 2015 NWEA measures of academic progress normative data. Portland, OR: Author. Retrieved from https://www.nwea.org/content/uploads/2015/06/2015-MAP-Normative-Data-AU…
  • Northwest Evaluation Association. (2016, November). RIT and Lexile measure -- how they connect. Portland, OR: Author.
  • Northwest Evaluation Association. (2017, February). A parent’s guide to MAP. Portland, OR: Author. Retrieved from https://www.nwea.org/content/uploads/2016/06/Parent-Guide-to-MAP.pdf
  • Pearson. (2012). AIMSweb ROI growth norms guide. Bloomington, MN: Author. Retrieved from http://www.aimsweb.com/wp-content/uploads/roi_growth_norms_guide.pdf
  • Pearson. (2015). aimswebPlus introductory guide. Bloomington, MN: Author. Retrieved from http://help.aimswebplus.com/prd/fo_plus/aimswebplus_introductory_guide…
  • Pearson. (n.d.). aimswebPlus detailed product overview.
  • Rankin, J. G. (2016). How to make data work: A guide for educational leaders. New York, NY: Routledge.
  • Rankin, J. G., Johnson, M., & Dennis, R. (2015, March 3). Research on implementing big data: Technology, people, and processes. Paper presented at the Society for Information Technology & Teacher Education conference.
  • Renaissance Learning. (2014). The research foundation for STAR assessments: The science of STAR. Wisconsin Rapids, WI: Author.
  • Renaissance Learning. (2014). Using data to inform instruction and intervention: Getting the most out of STAR assessments. Wisconsin Rapids, WI: Author.
  • Renaissance Learning. (2016, August). The next generation of response to intervention. Wisconsin Rapids, WI: Author.
  • Richards, S. B., Taylor, R. L., & Ramasamy, R. (2014). Single subject research: Applications in educational and clinical settings (2nd ed.). Belmont, CA: Wadsworth.
  • Ronka, D., Geier, R., & Marciniak, M. (2010). A practical framework for building a data-driven district or school: How a focus on data quality, capacity, and culture supports data-driven action to improve student outcomes [White paper]. Boston, MA: Public Consulting Group.
  • Shermis, M. D., & Daniels, K. E. (2002). Web applications in assessment. In T. W. Banta (Ed.), Building a scholarship of assessment (pp. 148-166). San Francisco, CA: Jossey-Bass.
  • Shinn, M. R. (2013, January). Measuring general outcomes: A critical component in scientific and practical progress monitoring practices [White paper]. New York, NY: Pearson.
  • Shinn, M., & Langell, L. (2009). AIMSweb progress monitor training: Summer 2009 [PDF slides]. Retrieved from http://www.aimsweb.com/wp-content/uploads/LL_PMSummer2009finalhandoutve…
  • Stufflebeam, D. L., & Coryn, C. L. (2014). Evaluation theory, models and applications (2nd ed.). San Francisco, CA: Jossey-Bass.
  • Todd, A. W., Horner, R. H., & Tobin, T. (2006, May). SWIS documentation project: Referral form definitions (version 4). Retrieved from http://www.pbis.org/common/cms/files/NewTeam/Data/ReferralFormDefinitio…
  • Tomlinson, C. A., & Imbeau, M. B. (2010). Leading and managing a differentiated classroom. Alexandria, VA: ASCD.
  • Tomlinson, C. A., & McTighe, J. (2006). Integrating differentiated instruction & understanding by design: Connecting content and kids. Alexandria, VA: ASCD.
  • University of Oregon. (n.d.). easyCBM teacher deluxe user’s manual. Eugene, OR: Author.
  • U.S. Department of Education. (2006, March). Designing schoolwide programs: Non-regulatory guidance. Washington, DC: Author.
  • U.S. Department of Education, Office of Planning, Evaluation and Policy Development. (2009). Implementing data-informed decision making in schools: Teacher access, supports and use. Washington, DC: Author.
  • Van Winkle, W., Vezzu, M., & Zapata-Rivera, D. (2011). Question-based reports for policy makers (ETS RM-11-16). Princeton, NJ: Educational Testing Service.
  • W.K. Kellogg Foundation. (2004). Logic model development guide. Battle Creek, MI: Author.
  • Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295-308.
  • Wayman, J. C., Wilkerson, S. B., Cho, V., Mandinach, E. B., & Supovitz, J. A. (2016). Guide to using the Teacher Data Use Survey (REL 2017–166). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Appalachia. Retrieved from http://ies.ed.gov/ncee/edlabs
  • Witkin, B. R., & Altschuld, J. W. (1995). Planning and conducting needs assessments: A practical guide. Thousand Oaks, CA: Sage.
  • Witt, J., VanDerHeyden, A., & Gilbertson, D. (2004). Troubleshooting behavioral interventions: A systematic process for finding and eliminating problems. School Psychology Review, 33(3), 363-383.
  • Wright, J. (n.d.). The savvy teacher’s guide: Interventions that work. Retrieved from http://www.jimwrightonline.com/pdfdocs/brouge/rdngManual.PDF
  • Wyatt Knowlton, L., & Phillips, C. C. (2013). The logic model guidebook (2nd ed.). Thousand Oaks, CA: Sage.
  • Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage.