Woodcock Johnson IV
I have been using this test for one year now; here are my personal musings. What do you think of it? Are you considering using it? https://educationaccess.co.uk/woodcock-johnson-iv/
Some SASC guidance that may have been missed, about omitting the SDMT when testing: please see https://educationaccess.co.uk/symbol-digit-modalities-test-sdmt/
Having read this article questioning the existence of dyslexia, I understand the author's concern about the inequality of diagnosis (and it is one I grapple with), but in trying to formulate my response to his other ideas, I found that these people say it far better than I could.
So please take a look at these responses: from behavioural optometrist Irfaan Adamally on oculomotor dysfunction, from Dr Neville Brown on the success of Maple Hayes Hall school, and from Prof Brian Butterworth on the even more severe consequences of dyscalculia.
Ann Bergin also wrote a personal response.
Happy reading and please do get in touch if you have any questions about dyslexia or dyscalculia.
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC, when it is marked against the marking criteria.
A common difficulty I have noted is with regard to testing too much (“over-testing”) or not testing enough; some examples are discussed below.
This is all part of marking criterion 4 which states:
Age-appropriate assessment materials have been chosen to cover all core components as relevant to the SpLD(s) under consideration …
It may be that, despite average scores on the core phonological awareness subtests, difficulties are evident elsewhere, for example in the background information, the qualitative observations or the attainment tests. In these cases, it can be of benefit to administer the supplementary phonological awareness subtests. This can (as the manual says) “yield much useful information about a person’s strengths and weaknesses”, providing us with further information and strengthening a potential diagnosis.
It is possible, in the core subtests, that an individual’s difficulties can be masked by the successful use of other abilities (such as logic and prediction, dare I say guessing?), compensatory strategies, such as visual memory, or the positive impact of powerful interventions during earlier schooling. These abilities are not available when blending and segmenting non-words; in the supplementary tests there is less scope for using these strategies as the words are unknown.
In these cases, I would explain why this additional testing was carried out, e.g. Due to their reported difficulties in this area, further phonological awareness testing was carried out to investigate in more depth.
If the core phonological awareness subtests demonstrate below average scores, I think that to administer the alternative subtests may well be over-testing as you already have enough information.
SASC guidance asks us to report on working memory. One way this can be assessed is via the ACI Index in the TOMAL 2 (other assessments are available 🙂 ). Therefore, we have to be careful when adding additional testing from the TOMAL 2 that is not related to working memory (even though it potentially provides very good information). The essential tests we have to administer are already very comprehensive. We have to be mindful of the fatiguing effects of over-testing and the impact of this on the reliability of the results. This is not to say other tests cannot be used, but they need to be justified.
Bearing that in mind, it may be worth considering (but it is absolutely not expected or essential) carrying out the Visual Sequential Memory Subtest. This provides us with a composite score for Sequential Recall Index. If the Manual Imitation score is markedly higher than those of the verbal tasks, it may demonstrate that visual sequential memory is stronger than verbal memory. This provides us with further evidence of a specific weakness with verbal memory (one of the characteristic features of dyslexia), whilst also providing us with a potential strength on which to build.
Again, we would explain why this additional testing was carried out, e.g. further testing was carried out in order to investigate a possible strength with their visual memory and to further evidence a specific weakness with verbal processing, known to be associated with dyslexia.
For a pre-16 report, the SASC guidance tells us a copying task might also be given so that difficulties relating to motor skills and the process of composition can be teased apart. This is especially important if the background information, our own observations and other assessments (e.g. SDMT, Diamonds) have highlighted difficulties with fine-motor skills. If you are using the DASH, then this activity is provided for you and can be used qualitatively if the person is below 9 years of age. It does not need to be a formal assessment as the qualitative observations will provide us with information. If you have the DASH, I would recommend carrying out the 4 core subtests to be sure of covering all core components as relevant to the SpLD, whilst also supporting a possible onward referral for further investigation into motor skills. As the manual explains, “many children with poor fine and gross motor skills also have handwriting difficulties and may perform poorly on the DASH…further investigation will be helpful to quantify and describe their level of performance on a range of motor tasks.”
This will help ensure we are covering some of the core components, whilst not going too far and straying into ‘over-testing’, and will help inform our diagnostic decision. This also forms part of marking criterion 4 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC, when it is marked against the marking criteria.
The fifth common error is not using the visual screener. This is part of marking criterion 3, which states:
A range of background information has been gathered from a variety of sources and that this has been used to inform the assessment and the diagnostic decision.
One useful piece of background information is that relating to a student’s visual abilities, in order to differentiate between visual problems and specific learning difficulties. Visual screeners can contribute useful information which can be captured in the Additional diagnostic evidence and information section of the report. Even if no visual difficulties are mentioned in the preliminary information gathering, a visual screener carried out by the assessor may bring issues to light which would otherwise be overlooked. For example, a child may notice letters moving on the page, but not think to mention this as it is perfectly ‘normal’ for them. As a result, this information would not be available to the assessor who has not administered a visual screener.
The one I use is the visual screener (Dr Jim Gilchrist, Caroline Holden, Jane Warren), as disseminated by SASC; its purpose is to identify possible indicators of visual discomfort and disturbance so that, where appropriate, referral to a qualified vision practitioner can be recommended.
The questionnaire includes a number of questions addressing different aspects of possible visual difficulty. There are two sections to the pre-16 years questionnaire: one for parents/carers and one for the child. Where involved, teachers could also provide corroboration, in the classroom context, of any issues noted.
Parental involvement in and permission for the use of this pre-16 years screener is necessary.
If no difficulties are noted and no further action is required, this is still stated to demonstrate you have considered this aspect. Where there are indicators of visual difficulties (discomfort and disturbance), these must be noted but not diagnosed and the assessor should describe routes to further assessment with a qualified vision practitioner, e.g. optometrist. For further support with filling in the questionnaire, there is additional guidance and the presentation from the SASC conference in June 2019.
This will form one element of the range of background information you consider, which will help inform your diagnostic decision and is part of marking criterion 3 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
I have migrated the original PDF into a Microsoft Word version of the pre-16 visual screener.
Part 6 of this series reports on the use of possible over-testing.
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC, when it is marked against the marking criteria.
The fourth common error is around calculating the composite score for Phonological Memory as part of the CTOPP-2 (ages 7-24). Correct calculation of scores is part of marking criterion 5, which states:
Information in the report reflects that tests have been administered correctly and all scores are calculated, converted and reported with 100% accuracy.
I have observed instances where the confidence interval used for Phonological Memory has been 8 (which is the value used for Phonological Awareness, Rapid Symbolic Naming and Alternative Phonological Awareness), but it should be 12. I surmise that people may have overlooked the fact that this one composite score has a different Standard Error of Measurement (SEM) from the others in the Composite Performance section.
There is not such an emphasis on confidence intervals in the new SASC formatting, but SASC are very insistent that, even though confidence intervals do not need to be used, we need to know how to work them out. As the SASC Additional Guidance explains, confidence intervals may be included to indicate test reliability. However, confidence intervals cannot be used to compare test scores unless tests are co-normed.
95% confidence interval = SEM x 1.96
(and round up or down to nearest whole number).
In other words, using the SEM of 6 for Phonological Memory: 6 x 1.96 = 11.76, which rounds to 12.
So the confidence interval for Phonological Memory is +/-12.
If your composite standard score is 80 for Phonological Memory (for example), the Confidence Interval will be 68-92.
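For anyone who wants to double-check the arithmetic, here is a minimal sketch of the same calculation in Python (the SEM of 6 and the standard score of 80 are simply the Phonological Memory example from above; the function name is my own, for illustration only):

def confidence_interval_95(standard_score, sem):
    # Margin of error = SEM x 1.96, rounded to the nearest whole number
    # (e.g. 6 x 1.96 = 11.76, which rounds to 12).
    margin = round(sem * 1.96)
    return standard_score - margin, standard_score + margin

print(confidence_interval_95(80, 6))  # (68, 92), matching the example above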
To avoid the common misconceptions associated with their use, confidence intervals should be explained carefully in Appendix 1 of your report. This helps with accessibility, which is part of marking criterion 15 in the APC Review Proforma.
In this way, you have demonstrated that this confidence interval is calculated correctly, which is part of marking criterion 5 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
Part 5 of this series reports on the use of the visual screener.
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC, when it is marked against the marking criteria.
The third common error is around relating the scores to the average. This is part of marking criterion 8, which states:
Scores are related to the average with consistency and unexpected differences in performance are acknowledged and discussed.
Relate performance to a level descriptor; you may also wish to note the standard score achieved in brackets.
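Purely as an illustrative sketch of what relating a score to the average can look like, the snippet below uses one common convention for standard scores (mean 100, SD 15). The cut-offs and labels are my own assumption for the purpose of the example, not SASC or publisher wording, so do substitute the descriptors your own tests and guidance specify:

def level_descriptor(standard_score):
    # Illustrative bands only (assuming a mean of 100 and an SD of 15);
    # always use the descriptors given in your own guidance or manual.
    if standard_score < 70:
        return "well below average"
    elif standard_score < 85:
        return "below average"
    elif standard_score <= 115:
        return "average"
    elif standard_score <= 130:
        return "above average"
    return "well above average"

print(f"Single word reading was in the {level_descriptor(82)} range (SS 82).")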
In this way, you have demonstrated that your scores are all related to the average with consistency, which is part of marking criterion 8 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
Part 4 of this series reports on the confidence interval for Phonological Memory, CTOPP 2.
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC, when it is marked against the marking criteria.
The second common error is around confidentiality. Confidentiality is marking criterion 1, which states:
Confidentiality is maintained throughout the report.
In this way, you have demonstrated confidentiality, which is marking criterion 1 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
Part 3 of this series reports on relating performance to a level descriptor.
In my role as an APC assessor, I have found there are many common errors made when submitting a report for APC.
The first is the order of tests as they appear in Appendix 2 and Appendix 4 of the report, when marked against part of marking criterion 15: accessibility.
It says in the SASC guidance for Appendix 4 (Test References and Descriptors):
In an accessible format and preferably arranged in the order presented in the assessment report.
Although it does not say this for Appendix 2 (Table of Results), the message is that this should also be presented in the order used in the report. Therefore, Appendix 2 and Appendix 4 should both follow the order in which the tests are presented in the body of the report.
In this way, you are considering accessibility for the reader, which could be considered as part of marking criterion 15 in the APC Review Proforma that all SpLD APC awarding bodies are using when reviewing an APC (The Dyslexia Guild, BDA and PATOSS).
Part 2 of this series covers confidentiality.
#BDAdyscalculia The BDA maths conference reinforced the idea that, when investigating dyscalculia, we are looking for that core deficit in estimating the number of objects in a set, and for the link between efficiency of dot enumeration and arithmetic. This was well summed up by the Havana study: “Inefficient dot enumeration itself almost guarantees poor arithmetic.” So this certainly has to be the focus (alongside other cognitive and literacy-based assessments) when assessing maths, in order to determine whether the difficulty is dyscalculia, more of a general maths difficulty, or indeed linked to other SpLDs. This was further supported by Sarah Wedderburn’s comment that numerosity is the ‘root’ of the maths tree.
I enjoyed the webinar by Janet Goring on the plan-do-review approach, which backs up my thinking that this needs to be done for maths. Schools use this a lot for literacy, and the ‘well-founded intervention’ feeds into the dyslexia diagnosis as per the Rose report, but it highlights the importance of this for maths too: assessment needs to be based not only on formal testing but also on the response to intervention, as mentioned by Butterworth, who stated that response to intervention is useful in deciding whether difficulties are dyscalculic in nature. #bdadyscalculia