Sunday, April 28, 2013

Week 12


                        High-stakes testing has become a dominant feature of schools in the United States.  Having watched my children go through school, some before the era of No Child Left Behind and others after, I have certainly noticed a difference.  However, beyond the newspaper articles and endless debate on the ramifications of NCLB, I have not noticed a drastic change in my children's education.  There may have been a greater emphasis on reading metrics in elementary school, but otherwise I have not been aware of a great difference.  Although many teachers malign the changes it has prompted in their manner of teaching, as well as the pressure it puts on them, I have not heard the same complaints from my children.  What they have noticed is the pressure their teachers seem to feel at annual testing time.  They have often commented, with some puzzlement, on how nervous their teachers seem during the weeks of high-stakes testing.  Of course, for many teachers it is nerve-wracking.  Teachers feel pressure for their students to produce good test scores because the scores reflect on them.  Some teachers have told me that much of their job evaluation rests on these tests, and in some districts a poor administration of the tests can even be a criterion for job termination.  Teachers feel enormous pressure.  But I'm not so sure the students feel it.  For students, finals or midterms are often much more of a high-stakes testing time. 
            However, when the book discussed looking at an environmental assessment when considering student achievement, I had to wonder how the stress teachers feel during and leading up to high-stakes testing might impact students.  Students are not oblivious to the pressure their teachers are under.  It is part of the equation that affects their learning, along with the many other variables that come together to foster it.  I can certainly understand why teachers feel immense pressure leading up to testing time, because these are certainly high-stakes tests for them and for the school as a whole.  But it is interesting to parse out for whom the tests are really high stakes, in order to get a better handle on their adverse impact on the learning environment.  As with any assessment, it seems worth reviewing the purpose of the test, whether it is meeting its stated intention, and whether the cost is worth the gain.  The value of high-stakes testing for students (the SAT, PSAT, etc.) and for teachers and schools (the STAR tests, PSSAs, etc.) is worth assessing repeatedly to be sure these tests serve their intended purpose. 

Drummond, R. J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Saturday, April 27, 2013

Week 12 Blog


Chapter 14 & Anastasi article

Chapter 14, "Assessment in Education," did a great job of summarizing one of the primary functions of school counselors. I must admit, I was a bit overwhelmed when I read this chapter. I kept thinking, "How will I know what to do?" "Where will I look for answers on how to proceed?" I might get lucky in my placement and work with a more seasoned counselor who can show me what needs to be done. Or I might be on my own, and then I will need to be very resourceful. I think the Drummond & Jones textbook could come in handy as a reference.

One area I did not know much about was "Assessing Specific Learning Disabilities." Drummond and Jones (2010) gave helpful descriptions of the academic areas to be aware of when assessing these specific learning disabilities. I enjoyed reading about the history of working with SLDs and the transition from assessing this population with the achievement discrepancy model to using responsiveness to intervention. The three-tier RTI model makes sense to me, although I don't know a lot about it in practice.

I also found the section in the textbook on "Assessing Giftedness" quite interesting. Apart from the work we did with the KBIT-2, I was not very familiar with these assessments. I didn't know that assessments existed for measuring giftedness in dance, drama, music, and visual art (Drummond & Jones, 2010). I have a good friend who is a respected gifted teacher in a local elementary school, and I know she will be a great resource for me in the future.

The section of the textbook I could relate to most was the topic of "Test-Wiseness." When I decided several years ago to apply to graduate school, I knew I would have to take the GRE. I was nervous, as I had not taken an exam like it since high school, and I felt very rusty in regard to my mathematical skills. I bought a study guide, and for several months I went through it, made flashcards, and took practice tests. This was very helpful not only for honing my verbal and math skills but also for learning the specifics of taking this particular test. All my preparation made me feel much more confident on test day, and I ended up being very happy with my final GRE scores.

Assessment is an integral part of our job as school counselors. It is crucial that we know what we do and why we do it in regard to assessment. It is important to be knowledgeable on many assessment fronts and to have the resources to back up our decisions as well as to help us move forward.

 

Drummond, R.J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Thursday, April 25, 2013

Selecting Assessments, Accountability and Moderating Variables


I found Chapter 7 to be one of the most valuable chapters so far in Drummond and Jones (2010).  This chapter laid out the appropriate steps for selecting assessment instruments and strategies and provided some helpful charts for working through the procedure.  The list of assessment resources provided a very thorough foundation for where to begin looking for different assessment strategies.  As a school counselor, I will benefit from lists such as this when I am looking for the best resources for each situation.  I also think I will be able to use the steps for evaluating an assessment instrument.  These pieces of information have given us important things to consider and look for when determining which assessment to use with our students.

I felt that this chapter from Drummond and Jones (2010) related quite nicely to the Studer, Oberman, and Womack (2006) article, which discussed a school counselor's accountability.  "Accountability means 'being responsible for one's actions...and documenting effectiveness through measures of professional activity outcomes'" (Studer, Oberman, & Womack, 2006, p. 2).  School counselors need to persist in making sure that other professionals within their buildings understand the importance of their job.  One way to stay accountable is to assess your students, collect data, and find where you are benefiting them.  Because of the large number of assessment instruments and the many places you can find them, it is crucial that school counselors understand how to evaluate an assessment and choose the best tool for each situation.  Chapter 7 addressed these needs and can help improve a school counselor's accountability.  Something I did find interesting in this article was the range of views school counselors have previously held on accountability.  Part of the problem with our field is that so many school counselors have not collected data appropriately because they are concerned about what accountability will do to their current position.  In particular, it could show that we are not as beneficial as we hope, or that we are so beneficial that we would be expected to do more.  These are both risks we need to take, however, in order to benefit our students.  We know we can help, and we should fight for that right. 

On a side note, I wanted to briefly mention my experience with the Piers-Harris 2 Self-Concept Scale.  I thoroughly enjoyed having the opportunity to explore an instrument so closely and even have the chance to administer and score a test.  When Dr. Baker brought up moderating variables, I thought I would look into them further in my manual.  The authors define a moderating variable as "a variable that may affect scores on a test independently of the construct that the test is supposed to measure" (Piers & Herzberg, 2002, p. 20).  During the revision of the second edition of the Piers-Harris, the researchers paid particular attention to possible moderating variables such as sex, ethnic group, and SES.  Because the researchers were able to use a larger stratified sample based on the U.S. population, these variables are taken into account to some degree.  What they found is that the test can be used with different groups of people without those variables specifically affecting the scores.  It was mentioned, though, that if a particular moderating variable is of concern, more focus should be placed on that variable and on determining its influence.  I am glad that I had the chance to review this more closely.


Drummond, R. J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals. (7th ed.). Upper Saddle River, NJ: Pearson.

Piers, E. V., and Herzberg, D. S.  (2002).  Piers-Harris 2: Piers-Harris children's self-concept scale.  (2nd ed.).  Los Angeles, CA: Western Psychological Services.

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9 (5).
 

Blog 11: Assessments


This chapter on selecting, administering, scoring, and interpreting assessment results was familiar because of the testing experience that we had.  I feel as though the hands-on experience of actually working with an assessment and going through the proper steps helped me more than just reading about it; I have always been the type of person who does better with hands-on assignments.  Even though the testing instrument was chosen for us, as future counselors it is important that we learn to choose the appropriate assessment ourselves.  It is imperative to determine the information that you are looking for before choosing an instrument.  Drummond and Jones (2010) brought up a valid point regarding identifying the available information.  When evaluating a child, you want to determine what you already know and look at the assessment information that already exists.  It is hard enough for students to take a test the first time; therefore, you do not want to administer the same test, or the same type of test, again.  This could lead to a lack of motivation, and from my experience teaching in an elementary school, students are already tested far too much.  There needs to be a purpose, and I believe you should make students aware of why you are giving them an assessment.  In addition, when evaluating an assessment you want to look at the time required to administer it, ease of administration, scoring, interpretation, format, readability, and cost.  As a counselor you may have limited time to pull students out of class, and you want to make sure the process goes as smoothly as possible. 

            In addition to selecting the test, administering it properly is very important.  I thought that Drummond and Jones (2010) did a nice job breaking down the before, during, and after.  That made it a lot easier to become familiar with what you should be doing when administering a test.  The chart that was presented is a great checklist that you could post and have readily available if you are asked to administer a test.  Next, with scoring, it is important to take your time.  When reporting results you want to be accurate in the information that you are providing to the parent and child regarding their levels.  These results could affect placement in particular classes, so you want to ensure that you are scoring the assessment accurately.  In conclusion, I feel as though the information presented gave me a clear understanding of the important factors in the assessment process.
 

Drummond, R. J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Proving Our Effectiveness



When it comes to assessments, school counselors and other professionals alike should be held to the same standards and keep each other accountable.  The article by Studer, Oberman, and Womack (2006) focuses heavily on how school counselors can prove that we are being effective. While I do not disagree that we need to make sure our methods and practices are effective, the school system as a whole needs to operate as a functional unit. Since we work so closely with other school officials and have guidelines placed upon us by the department of education, school counselors cannot be the only ones expected to be accountable. A lack of accountability across the board will affect many aspects of education, our jobs included. 

               As a former graduate assistant for the department of Housing here at Millersville, I unfortunately experienced a lack of consistency and accountability, which has given me some first-hand knowledge of the difficulties this can create. In fact, I am working through an example of this right now. It was brought to my attention by fellow, still current, graduate assistants that at a GA staff meeting last Wednesday, my supervisor's wife made some unprofessional and sarcastic comments about me. It's not so much what she said, but that even after I finish the letter I am currently crafting about this unprofessional behavior, most likely nothing will be done about it. These types of situations, where supervisors face no accountability, make the structure itself weak. 

               I think that Studer et al. (2006) make some valid points, the most poignant being that we did not heed the warnings and make appropriate changes to prove our value early on, and as a result were assigned non-counseling duties. I think any active school counselor who finds themselves administering PSSAs or any other non-counseling duty can agree that we still struggle to be seen for our true purpose today.  I love that this program will prepare us to fight the stigma that we aren't really useful by having us create a presentation for interviews about what we are prepared to implement. I think that more aggressive strategies like this are paramount to changing how we as future school counselors are seen and respected in the field.

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(5).

Selecting and Designing Assessments

Because of my participation in this course, I have done a lot of thinking about my role in administering and interpreting assessments as a school counselor. Until reviewing this week's readings, though, I did not think much about my responsibilities for selecting and designing assessments. The textbook provides some really valuable resources in this area, confirming my assumption that this book will be one I keep close at hand for future reference in my practicum and job. The text outlines five important steps for selecting instruments: identifying the type of information needed, identifying available information, determining the methods for obtaining information, searching assessment resources, and evaluating and selecting an assessment instrument or strategy. I especially liked the resources described in the assessment resources section, such as the Mental Measurements Yearbook, Tests in Print, and the various assessment journals, as these are good starting points for collecting general, unbiased information about specific assessments or categories of exams. I also appreciated the questions suggested for evaluating and selecting an assessment instrument or strategy, as I can envision myself using them as an initial checklist when I am presented with this task.

The article, "Producing Evidence to Show Counseling Effectiveness in the Schools," reminds us that we will also be responsible for evaluating our own counseling programs. It was interesting to read about some of the interviewed counselors' fears of accountability, as I have had many of these same thoughts throughout the progression of this course. Similar to the information presented in the text, this article also describes some practical questions that a counselor should consider before selecting an assessment. 
In this discussion, I really liked the suggestions for question formation, as this provided some good "dos and don'ts" for designing assessments that I think I could follow. I also found the practical examples useful, especially the description of the middle school counselor who was tasked with designing a leadership group for 22 eighth-grade students. The activities the counselor engaged in, such as conducting a literature review, locating an assertiveness test, using a pilot study to gauge the effectiveness of the assessment, and designing pre-test and post-test Likert scales, are all things I can see myself doing at my school. Additionally, presenting the information in this practical way makes the process seem less intimidating and actually achievable.

Drummond, R. J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(5).

Blog #11

            After reading the article by Studer, Oberman, and Womack (2006), I started to think about all of the reasons why it would be beneficial for school districts to have specific accountability policies in place.  I have learned many important lessons throughout my short working career.  One of those lessons involves valuing the importance of accountability.
            Even though I agree that accountability practices need to be put in place for school counselors, I also believe that accountability needs to be applied consistently across the board.  For example, if I am expected to live up to a certain standard as a school counselor, then administration should also be expected to live up to a similar standard.  With that said, in order for consistent accountability practices to be put in place, clear and concise expectations need to be laid out for every position.  I believe that each position should have a detailed job description that lists all of the expectations of the job.  I also believe that each job description should list all of the ways in which each person is held accountable for their performance in that specific job.  After an accurate and clear school counselor job description is developed, it should be the job of the administration and the school counselor to make sure that the job description is consistent with the work that the counselor is doing.  For example, if the school counselor is spending the majority of their time doing clerical work, scheduling, etc., the job description and the accountability practices should be based on the actual work the counselor is doing, not what the counselor is supposed to be doing.
            I personally believe that everyone should hold each other accountable when working in a school.  Sensitive, open communication can go a long way, while passing judgment is always unacceptable.  I am also a huge believer in holding myself accountable in all aspects of my life.  So when the article talks about assessing your own counseling practices to measure their overall effectiveness, I am completely supportive of it.  When I eventually do become a school counselor, I want to be the best counselor I can be.  Only good things can come from gaining feedback from my students.  As a counselor, my goal will be to improve constantly, day to day.  By assessing students for feedback, not only would I show administration that my counseling techniques are (hopefully) effective, I would also show the students that I am actually taking ownership of my job as their counselor. 

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(5).

Tuesday, April 23, 2013

Week 11


            It seems very sad to me that, given the depth and breadth of issues in our world today, especially in the face of many recent tragedies, counselors need to defend their importance to a school district.  If anything, I believe that schools should have more counselors and mental health services in general.  However, the education budget crises in this state and many others unfortunately warrant such a defense, and solid data is the only way to do so.  This week’s article seems to be a helpful reference for tackling this task in the future.

            I understand the concerns behind reluctance to use data-collection methods to show accountability.  One of my biggest concerns is that administrators are reluctant to approve data collection from students due to the belief that students are not reliable sources of information.  In my mind, this goes back to our discussions in class about the lack of self-awareness and the inaccuracy of self-reporting.  Students also sometimes show bias on questionnaires or rating scales based on how much they personally like the teacher or counselor involved.  I think that, to conduct such research more successfully, collateral sources, especially parents and teachers, should very often be used.  Time is another huge factor.  I have thought for the past couple of years about conducting action-based research with my students. However, being pulled in many different directions at once, handling a constant flow of paperwork, and working toward my Master's degree have made actually doing so difficult.  I have done action-based research on a small scale for my guided supervision goal.  Two colleagues and I have used data to show the effectiveness of our after-school Homework Club, although it is not nearly as comprehensive as I would like it to be.  To make it so would take much more time than we can feasibly spare.

            Assessment measures must be carefully chosen, as discussed in the textbook.  In addition to the factors discussed, another consideration is what is available in your school district and what you are told to use for different assessment purposes (e.g., our district uses the WISC-IV for all gifted testing).  One factor the book discussed is the format of the assessment, which is very important, and not just for standardized assessments.  This goes back to my question to Matt last week about the Self-Directed Search.  Students with learning disabilities and with ADHD greatly benefit from larger font, chunked and well-organized material, and extra white space.  It helps them organize and process information; otherwise, they will often give up out of frustration just because something looks difficult.  If they have to put more effort into processing and understanding an assessment tool than they are willing to give, they will be less likely to provide accurate information, if they provide any information at all.

            Data may also not show any kind of effectiveness, although this could be due to a variety of factors.  In my experience, student motivation is a huge one.  The example given in the article relates to study skills and a pre- and post-assessment in the form of a questionnaire.  I used to teach a study skills class, and getting students to want to learn and apply the skills was quite a challenge.  While it was helpful for students who truly wanted to improve their study and organizational habits, it did not make a lick of difference for those who did not want to be there.  Just as in counseling, progress is difficult to make if the client does not want to change.  In my current experience with Homework Club data, the data overall do not show the effectiveness of Homework Club; however, these are data based on students who attend fairly regularly, and if they did not attend, I would presume that their work completion rates and grades would be much lower than they are.  I believe that it helps them stay "afloat."  The data suggest that perhaps we should put more supports in place (e.g., run small groups during Homework Club that focus on specific study and organizational skills), but if not interpreted carefully and with other considerations in mind, they could become a reason to shut down Homework Club completely.


References

Drummond, R. J. and Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(5).

Post 11 - Assessment and Accountability in School Counseling


I found both challenge and encouragement as I read "Producing Evidence to Show Counseling Effectiveness in the Schools" (Studer, Oberman, & Womack, 2006) this week.  I was challenged to become an accountable counselor by documenting measures of professional activity outcomes.  Failing to provide this type of documentation can lead to dire outcomes, yet Studer, Oberman, and Womack (2006) assert that school counselors can be reluctant to engage in accountability procedures for many reasons: fear of being overburdened with extra work, fear that they do not have the necessary skills to conduct research, fear that accountability would negatively impact their performance, fear of litigation, the belief that students are not reliable sources for obtaining information, or a lack of time, funds, or additional resources.  While Studer, Oberman, and Womack never use the word "fear" when listing the reasons why school counselors may shy away from accountability measures, I found fear to be an underlying theme in their rationale.  Perhaps fear stood out to me because I, like other counselors, am afraid that I might not be able to design an assessment instrument that would produce dependable results.  However, failing to produce any type of assessment instrument for programs that I might create or oversee would probably create larger issues.  If I'm not assessing students both during and after my program(s), I will not know how my program(s) could be improved to better meet students' needs.  As I work on developing a program for my graduation project, I'm sure I will want to do some formative evaluation in order to figure out what I will need to tweak before potentially presenting the project at a job interview and/or implementing it in my school. 

I found the discussion on action-based research to be encouraging as I consider how to go about working on my graduation project.  The steps of identifying an area of concern, collecting data, analyzing the data, and developing a plan seem practical and somewhat familiar.  While I’m not a scientist by any means, the process of action-based research reminds me of the scientific method.  First, you make a hypothesis (identify an area of concern), then you test your hypothesis (collect data), you analyze your data, and finally decide what you need to do next (develop a plan).  Are your findings significant?  If so, how?  Is further testing needed to assess another variable?  These are all questions that one might also ask during the action-based research and evaluation process.  As I return once again to the topic of the large project looming ahead of me, I have a better picture of what steps I will take to develop a plan (program) that will actually benefit a specific population of students, and the task no longer seems quite so overwhelming and scary. 

Reference
Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(5).

Week 11


            This week's article makes a very important point about the need for counselors to show not only what they do, but what difference it makes that they do it.  This is completely in step with trends in education and other fields, where programs, employees, and professionals are expected to be accountable for the effectiveness of their work.  Although this accountability can be seen as an onerous and unnecessary demand, it can also be of great value.  I think it helps administrators, parents, and teachers come to value the work of counselors when counselors can illustrate the effectiveness of what they do.  Additionally, effective assessments prompt counselors to be very clear about what they hope to accomplish through their counseling programs, which gives focus and intentionality to any undertaking.  If counselors cannot articulate what they hope to accomplish, they cannot effectively measure it.  I also believe that formative and summative assessment tools are essential for improving and tweaking any endeavor.  Assessment enables school counselors to stop flying blind and choose wisely when it comes to where and how to spend their limited time and resources. 
            Because funding is often very limited, counselors may want to apply for grants, which typically require proof of effectiveness. In addition when asking parents or students to participate in any programs, charts and illustrations demonstrating past effectiveness are likely to garner greater interest in the program.  In today’s society, many people don’t participate in something simply because someone says it is a good idea.  People often want to know what the track record has been and what the specific intended result will likely be.  
            Counselors can use paper-and-pencil assessments, or they may want to utilize current technology such as Survey Monkey, which makes assessing fun, easy, and quick.  Survey Monkey offers a quick step-by-step process for designing assessments and tabulates the results.  It is a free online service, and a link to a particular survey can be emailed to people very easily.  
Additionally, the impact of assessment results can be greatly enhanced by charts and illustrations.  People generally don't have the time to read all the words in a report, but they will remember charts and graphs. Many students and parents are unsure of what counselors do.  Assessment results can be a wonderful way to educate others about the possibilities and the good work that counselors do in the schools.  Rather than seeing assessment as a burden, counselors could view it as a wonderful tool and opportunity. 

Studer, J.R., Oberman, A.H., and Womack, R.H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(6).

Friday, April 19, 2013

Week 11 Blog


 

Chapter 7 and the article by Studer, Oberman, and Womack (2006) contain a lot of practical and sage advice, along with many helpful charts and checklists. I can see myself referring back to these works in the future. I must admit that this area of assessing our school counseling programs makes me a bit nervous. The article really pointed out that school counselors are doing a disservice to their programs and the profession by not showing evidence of effectiveness: "Unfortunately, because school counselors did not heed the early warnings to actively demonstrate success, programs and personnel were eliminated" (Studer, Oberman, & Womack, 2006). With cutbacks and consolidations, proving that our services are invaluable is crucial.

I enjoyed reading the descriptions of the examples of action-based research in Studer et al.’s article. I can see where they could all have a place in conducting research. I got a laugh reading the story of “Kagan the Dragon”: after much time and money spent working in a Hispanic neighborhood using this tool, it was found that “Kagan” can translate into “pooper” in Spanish. I laughed, but I also realized how easily this could happen when creating an instrument, and how important it is to keep the gender, cultural, and ethnic makeup of a group in mind when designing an instrument and testing its reliability (Studer et al., 2006).

Drummond’s textbook highlighted sources of assessment instrument information, which I used to gather more information on the testing instrument I will be presenting to the class soon. I found some great reviews of the instrument I chose, and they raised questions and concerns I had not previously considered. The Mental Measurements Yearbook was a very helpful reference source (Drummond & Jones, 2010). I also found the rest of the chapter very helpful in preparing for, administering, and interpreting assessments with my client. I feel it is important to keep these recommendations fresh in mind as you work with clients and various assessment tools.

 

Drummond, R. J., & Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Studer, J. R., Oberman, A. H., & Womack, R. H. (2006). Producing evidence to show counseling effectiveness in the schools. Professional School Counseling, 9(6).

Thursday, April 18, 2013

Chapter 11 Career and Employment assessments



               This chapter on career and employment assessment discusses different types of career assessments and the factors that affect career choices. I found it very interesting, as I really enjoyed taking my Career class this summer. I was particularly drawn to the section on Holland’s SDS. This past Saturday, I joined the Office of Career Services (even though that is no longer its preferred name) in facilitating Holland’s SDS and a discussion for new potential students who had not yet chosen a major. It was a lot of fun, especially since we grouped students by their interests first and then discussed the different code types and realistic major choices for them. I know that when I took the SDS, it was very accurate for me.  As school counselors, career direction is an essential part of our job. It is important to understand the limitations that our clients will bring to the table. Without being well versed in basic career assessments, we will not be able to successfully guide our students in the right direction. 


               As skeptical as I was when I started this class, I am really starting to fully appreciate the number of assessments I have become familiar with, as well as the importance they bring to the school counseling field. I was surprised to read that testing does not discriminate against minority groups or women to the degree I expected. I would tend to think that, due to language barriers and/or cultural differences, there would be discrepancies and, as a result, some sort of bias. While bias does exist, it does not seem to be as prevalent as I always thought. Since much of the chapter dealt with career and personnel testing, it reminded me a lot of the article I presented on. 


               I think it is interesting that employment tests are more uniformly used in federal and state agencies, but it makes sense. Big organizations need to function like well-oiled machines, and what better way to do this than by standardizing the evaluation and testing procedure?

Drummond, R. J., & Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Career Assessments

I was excited to read about career and employment assessment this week because I feel as if I have some expertise in this area, having worked for the Office of Experiential Learning and Career Management on campus since last fall. I also chose to review a career assessment for my testing presentation, so I have been especially immersed in the world of career services over the past week. I reviewed the MRP, a college and career planning system for high school students. In our office we use a version of this program called FOCUS 2. Although I think the use of FOCUS is more widespread than the MRP, I found the MRP to be more easily accessible and to contain more resources. The system also generates user reports and system-wide data, so I think it is really useful for school counselors as they have conversations with students and parents about future plans and report school-wide data on students’ plans after graduation. I was also comforted by the fact that this system was developed from a strong research base, providing inventories based on Holland Codes and pulling in information from other assessments, such as the Strong-Campbell Interest Inventory, all of which are described in the text.

I was happy to see that the text also included information about interviews, as these are a very important part of the job assessment process. As mentioned in the book, situational interviews are becoming very common; in these interviews, applicants describe ways in which they have responded to different incidents in their previous jobs. We have learned a technique for responding to these sorts of questions, called the STAR method, which students have found helpful. There are four steps, each associated with one letter of the acronym. First, describe the Situation or Task you needed to accomplish. Try to describe a specific event rather than a generalized account, and give enough detail for the interviewer to understand. The situation can come from a previous job, a volunteer experience, or any relevant event. Then describe the Action you took, keeping the focus on yourself. Even if you are discussing a group project or effort, describe what you did, not the efforts of the team. Don’t tell what you might do; tell what you did. Finally, describe the Results you achieved. What happened? How did the event end? What did you accomplish and/or learn? It may benefit applicants to practice describing specific incidents with this method in advance of the interview. Although you may not know exactly what the interviewer will ask, you can anticipate the questions asked in most interviews and have certain responses prepared. Mock interviews can also help you determine how you may behave during a formal interview session.

Belludi, N. (2008). Use the STAR technique to ace your interview. Right Attitudes Blog. Retrieved from http://www.rightattitudes.com/2008/07/15/star-technique-answer-interview-questions

Drummond, R. J., & Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.

Wednesday, April 17, 2013

Interest Inventories

After reading Chapter 11 in Drummond and Jones (2010) this week, I was taken back to my first year of graduate school when I took Career Development.  In my first semester in the school counseling program, I was introduced to many assessment techniques that I did not fully understand until taking Statistics again and now Appraisal.  I appreciated being able to take a different perspective on some of the testing instruments that I had already learned about.  In particular, the interest inventories stuck out to me because I have so often taken similar tests.  An interest inventory is a simple instrument that any type of counselor can use as a starting point for class schedules, extracurricular activities, majors, and careers.  These inventories are easy to use and provide the individual with multiple options and areas to explore.  Oftentimes people have no clue where to start looking for a career field they may want to enter, but interests are the perfect place to start.  As Drummond and Jones (2010) point out, interest can be related to motivation, which is such a powerful aspect of success in school, career, and life.  When I have a lack of motivation, I find it difficult to accomplish anything, even those things that are required of me.  Having strong interest in what I am doing allows for greater motivation and success in my life.  To me it seems clear that interest inventories can provide a powerful foundation on which to build a person's career and life satisfaction.

At the end of the chapter I found an interesting discussion question that I wanted to address:  "What do you think is the best way to find out someone's interests: (a) using an interest inventory, or (b) asking the person, 'What are you interested in, or what would you like to do or be?'" (Drummond & Jones, 2010, p. 244).  I think before graduate school I would have answered this question very differently, but through my classes and the experiences I have gained over the past two years, it has become clearer to me how beneficial testing instruments can be.  My opinion is that interest inventories can be much more successful at helping individuals determine their interests than simply asking them what they like doing.  Oftentimes people believe they understand what they like and don't like but may not recognize everything that can fall into their interests.  For example, the students I work with have such a difficult time choosing a major or even classes that they want to take each semester.  I ask them what they are interested in, and they can't easily describe anything.  I get responses such as video games, watching TV, shopping, or playing basketball.  These are all interests, for sure, and great things to enjoy doing, but they tell us very little about those individuals.  An interest inventory can ask more detailed and in-depth questions to find out things like whether a student wants to work outside or inside, with people or alone, at a desk or moving around, with numbers or with words.  Oftentimes individuals do not even consider all of the areas in which they have strengths or find enjoyment, because they are unsure of how to describe everything.  Asking someone what they are interested in is such a broad question and a difficult one to answer quickly.  Interest inventories can assess these types of questions without the individual even realizing it, and they provide multiple categories and areas to look into further. 

I believe I am starting to fully appreciate the help that assessment tools can provide to someone such as a school counselor.  With such a busy job, it is difficult to get the full picture of a student without using efficient, time-saving methods such as testing instruments.  Interest inventories are just one of the many ways that school counselors can successfully use assessment to improve their effectiveness.

Drummond, R. J., & Jones, K. (2010). Assessment procedures for counselors and helping professionals (7th ed.). Upper Saddle River, NJ: Pearson.