ATA’s Certification Exam Preparation Workshop in Boston

Reblogged from The ATA Chronicle, with permission

ATA’s Certification Exam Preparation Workshop presented opportunities for participants to learn how the Certification Program works, including the general characteristics of exam passages and how exams are evaluated and graded.

ATA held a Certification Exam Preparation Workshop on January 20 at the University of Massachusetts Boston. Training has always been an important part of ATA’s mission, and organizers wanted to see if a full-day workshop led by graders of ATA’s Certification Program could successfully benefit both exam candidates and the program.

The workshop consisted of two sessions designed to help participants understand how the exam is graded and the common errors candidates make. The morning session was for those interested in taking the exam from English into Spanish, while the afternoon session focused on those interested in taking the exam from any language into English. The two of us (Rudy and Diego) were in charge of the English>Spanish session (aside from grading, we work in the English>Spanish workgroup in ATA’s Certification Program). The other two graders, Bruce Popp and Andy Klatt (who work in the French>English and Spanish>English workgroups, respectively) led the into-English session.

Session I: Preparing for the English>Spanish Certification Exam

To develop and tailor this session, we mailed participants a sample practice test to translate, giving them about 10 days to complete and return it. These tests were then graded using the same criteria applied to the actual certification exam. The purpose of this exercise was to target each participant's common—and not so common—errors. The results were then discussed during the session, although any specific examples used were kept anonymous.

The main benefit of this exercise for participants was that they were able to learn by comparing each other's translations and discussing why one rendition worked and another didn't. It allowed participants to gain a better understanding of where errors happen and to identify whether they are word-, sentence-, or passage-level errors. This analysis also allowed participants to see how errors affect comprehension of the entire translated passage. There was plenty of back-and-forth discussion, including participants' explanations of their choices and decisions. Each participant received his or her own marked-up practice test at the end of the workshop.

Session II: Preparing for the Into-English Certification Exam

Just like the morning session, the afternoon session began with an introductory talk with visual aids to provide a detailed explanation of the nature and expectations of the certification exam, the error categories and what they mean, and grading criteria and standards. Participants were introduced to the common criteria for grading into-English tests regardless of language pair. The Into-English Grading Standards (IEGS), which are available on ATA’s website, form an essential basis for grading all language pairs in which English is the target language.

The concept of evaluating errors based on the extent to which they detract from the usefulness of the translation to a potential client was also covered. The discussion then switched to some of the essential characteristics of an effective translation, the principles for exam preparation, and test-taking skills. After this, participants were divided into two groups.

Since a large proportion of the into-English group was composed of Spanish>English candidates who had taken the morning session, that group met separately to review the errors on the sample Spanish>English practice test that many of them had taken in preparation for the workshop. The second group was composed of candidates who work from a diverse set of languages into English. The presenters at this session were able to use materials that had been provided by several into-English certification workgroups to exemplify some of the challenges faced by candidates, including carrying over the linguistic organization of a text into a very different, sometimes unrelated, language. As was the case in the morning session, candidate participation was strong and enthusiastic.

A Favorable Response

The workshop proved to be a success, based not only on the number of attendees (the workshop sold out), but also on the diversity of the participants: people from as far away as the West Coast, Texas, Florida, and even Venezuela attended. With its maritime view, the University of Massachusetts Boston proved to be an attractive venue, even in winter. We were fortunate that the weather was cooperative that day, as Boston was experiencing a particularly rough winter. Many people signed up for both sessions, and while the content of the morning and afternoon sessions was different, they built upon each other.

Comments after both sessions were positive, as were most of the comments made in the post-event evaluations. As with any pilot program, some kinks need to be worked out. For example, one comment indicated that too much time had been spent on the administrative aspects of the testing and grading process, forcing presenters to rush through the more interesting part where passages were put under a magnifying glass and reviewed in detail.

As a direct result of the evaluation comments, we prepared a video that explains many of the generic details regarding the exam and presented it at a subsequent workshop that took place as part of the “Spring Into Action” conference co-sponsored by ATA’s Spanish Language Division, the Association of Translators and Interpreters of Florida, and Florida International University. In this way we were able to devote the entire workshop to analyzing the candidates’ proposed translations. The event in Miami was not part of ATA’s Certification Program, but the changes implemented for the workshop demonstrate that the Association and its graders respond to membership feedback to make its programs as rewarding, informative, and fun as possible.

ATA’s Certification Exam Preparation Workshop presented opportunities for participants to learn how the Certification Program works, including the general characteristics of the passages and how exams are evaluated and graded. In addition, participants were able to learn from the graders about the specific challenges found in exam passages and gain a better understanding of the common and individual mistakes that arise.


ATA’s Certification Exam: Introduction

ATA Practice Test: Benefits

Explanation of Error Categories

Flowchart for Error Grading

Framework for Standardized Error Marking

Into-English Grading Standards

Rudy Heller, an ATA-certified English>Spanish translator, has been a grader for ATA’s English>Spanish certification exam for over 12 years. He is a federally certified court interpreter and has been a professional translator for over 40 years. He is a former ATA director. Contact:

Diego Mansilla, an ATA-certified English>Spanish translator, is a grader for ATA’s English>Spanish certification exam. He is the director of the Translation Program at the University of Massachusetts Boston, where he also teaches advanced courses in translation. He is a member of the board of directors of the New England Translators Association. His areas of research are translation pedagogy, collaboration in translation, and online education and assessment. Contact:

Certification Exam Changes

Reblogged from The ATA Chronicle, with permission

There are major changes ahead for ATA’s certification exam in 2017.

Eligibility Requirements: Education and experience requirements needed to take the exam will be discontinued in January 2017. Why? Because they failed to predict the chances of an individual passing the exam. And that was the whole point—to ensure that exam candidates were not taking the exam before they were ready.

Note: An exam candidate still needs to be an ATA member in order to take the exam.

Exam Passages: All three exam passages will be general text in 2017. Why? Because the labeling of texts as medical, technical, or scientific and as legal, commercial, or financial was widely misunderstood. The intent of the exam has always been to certify translation competence as a whole, not competence in a specialty.

Practice Tests: Practice tests will become available for download in the near future. Why? Because it's crucial for exam candidates to know what they are walking into—not what they think the exam is like, but what it actually is. The practice test is the best way to find out. Making it easier to take the practice test may encourage more people to do so.

Candidate Preparation Workshops: The Certification Committee is working to increase the availability of these workshops, as both live sessions and webinars. Why? Because they are another way for candidates to understand the exam and take a good look at whether they are ready for it.

Computerized Exam Option: More testing sites will offer computerized exam sittings next year. Why? Because now that the problem with exam security has been resolved, it makes sense to give exam candidates more of the tools they use in their translation work.

For more information on ATA’s Certification Program, please click here.

Image source: Pixabay


Computerized ATA Certification Exam Option Now Available at Select Sittings

 Reblogged from The ATA Chronicle with permission (incl. the image)

ATA is now offering a computerized option for taking the certification exam at select sittings. Candidates will now be able to take the exam on their own laptops. During the exam, candidates:


  • May use most resources stored on their laptops, including dictionaries and glossaries.
  • May use non-interactive Internet resources, such as online dictionaries and other reference material.
  • May not use CAT tools or translation memories.
  • May not use e-mail, chat rooms, forums, or MT tools such as Google Translate.

This is to ensure that the work is the translator’s own and that the carefully vetted exam passages are not shared.

How Does the Computerized Exam Work?

Candidates input their translations using WordPad (or TextEdit for Mac) onto an ATA-supplied USB drive, with grammar and spell check utilities disabled.

Signed Statement Required

Candidates who opt for the computerized format must sign a statement acknowledging that certain activities are prohibited during the sitting (e.g., use of e-mail and chat, copying the exam passages) and that they understand the consequences of noncompliance.

Candidates who violate the rules applicable to computerized sittings are likely to face restrictions on future certification eligibility and could face ATA ethics violation proceedings.

Information about the statement candidates will sign and the consequences of rules violations is available from ATA’s Certification Program manager.

For a description of the exam format, please see the certification exam overview.

Handwritten Exam Available

Candidates can also choose to handwrite their exam. All candidates may continue to bring and use any print resources they wish.

Exam Schedule

Sittings continue to be scheduled primarily through ATA chapters and affiliates as well as through other local groups.

Groups and individuals interested in hosting a sitting should contact ATA’s Certification Program manager to inquire about the physical and technical requirements needed to host a computerized sitting.

Several computerized sittings will take place in 2017, including at ATA’s 58th Annual Conference. See the schedule of upcoming sittings for the status of future examination sittings.

ATA Certification Pass Rates 2003-2013, 2004-2014, and Statistical Trends

By Geoffrey S. Koby
Reblogged from The ATA Chronicle with permission from the author (incl. the image)

The Certification Committee is happy to report here on certification pass rates for 2003-2013 and 2004-2014. The average certification pass rates for these two sets of data have remained relatively stable, although other factors in ATA’s Certification Program have changed somewhat in the past two data sets. The four sets of 11-year data that have been published in The ATA Chronicle to date (2001-2011; 2002-2012; 2003-2013; and 2004-2014) now allow for some interesting comparisons and analyses.

To describe the results effectively and avoid distortion, the information has been divided into two groups: 1) languages with 40 or more exams in the reporting period; and 2) languages with extremely low volume (ELV), defined as language pairs with fewer than 40 exams in the reporting period. In the following, we report summary statistics for the entire set of exams for 2003-2013 and 2004-2014, broken down by these two groups.

For 2003-2013, the overall pass rate was 14.47%. A total of 6,339 candidates (previous period: 7,033) took the exam in 29 language pairs (previous period: 29), and 917 exams were rated “pass” (previous period: 1,032). Of these language pairs, 16 had 40 or more exams over this period (previous period: 18). The Polish>English and Dutch>English exams have entered ELV status due to low demand for these language pairs, while Finnish>English is no longer represented. However, Swedish>English has started as a new language pair.

For 2004-2014, the overall pass rate was 15.45%. A total of 5,463 candidates (previous period: 6,339) took the exam in 29 language pairs (previous period: 29), and 844 examinations were rated “pass” (previous period: 1,032). Of these language pairs, 16 had 40 or more exams over this period (previous period: 16). The individual language pairs are listed in Table 1 in alphabetical order, with the number of exams and the individual pass rates per language pair for both data sets.

In both data sets, 13 of the 29 language pairs had fewer than 40 exams. Table 2 shows the combined results for these language pairs. The data is presented this way because these language pairs cannot be averaged reliably due to their low volume. Another reason is that exams in some languages were not offered for the entire period. The Italian>English language pair was suspended in 2007 and was only reinstated in 2015, so it will remain in the ELV category for some time. In addition, Hungarian>English, which had a low volume to begin with, has been suspended since 2008, although work is ongoing to reinstate it.

Table 2

Figures 1 and 2 present the two data sets graphically, in a format slightly different from previous pass-rate reports. The dashed horizontal red line shows the mean pass rate. No standard deviation is provided for the pass rate percentages because the language pairs have widely divergent numbers of exams. Overall, these figures show that pass rates differ across language pairs.

Figure 1 shows the pass rates for 2003-2013. The pass rates for the high-volume pairs range from 8.42% for English>French to 28.42% for English>Portuguese. The ELV languages, 13 pairs averaging two or fewer exams per year, have an aggregated pass rate of 34.15% (3.23% of all exams).

Figure 1


Figure 2 shows the pass rates for 2004-2014. The pass rates for the high-volume pairs range from 9.00% for Arabic>English to 28.97% for English>Portuguese. The ELV languages, again 13 pairs averaging two or fewer exams per year, have an aggregated pass rate of 35.88% (3.11% of all exams). A slightly higher or lower number of passing ELV exams in any data set can greatly skew an individual pair’s average.
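The skewing effect of low exam volume can be sketched numerically. In the snippet below, all exam counts are invented for illustration (chosen only to roughly echo the ~34% aggregate reported above); they are not ATA's actual figures.

```python
# Toy illustration of why extremely-low-volume (ELV) language pairs are
# pooled rather than averaged individually. All figures are hypothetical.

def pass_rate(passed, taken):
    """Pass rate as a percentage of exams taken."""
    return 100.0 * passed / taken

# Suppose 13 ELV pairs together account for 26 exams with 9 passes.
elv_taken, elv_passed = 26, 9
print(f"aggregate ELV rate: {pass_rate(elv_passed, elv_taken):.2f}%")  # 34.62%

# For a single pair with only 2 exams, one extra pass swings its
# individual rate from 0% to 50%...
print(pass_rate(0, 2), pass_rate(1, 2))  # 0.0 50.0

# ...but the same extra pass nudges the pooled rate by only ~4 points.
print(f"with one more pass: {pass_rate(elv_passed + 1, elv_taken):.2f}%")  # 38.46%
```

Pooling total passes over total exams keeps one lucky (or unlucky) candidate in a two-exam pair from dominating the reported rate.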

Figure 2


With four data sets with which to work, it is now possible to show some trends. Figure 3 shows that the number of exams has been declining over time, from 7,585 exams in 2001-2011 to 5,463 exams in 2004-2014. This is not surprising, as the number of exam candidates declined in 2002 with the implementation of eligibility requirements. The number of ELV exams has remained small but relatively stable, with just under 100 exams per data set into English and just over 80 into foreign languages. At the same time, the number of high-volume exams has declined 28% overall, with exams into foreign languages declining 27% and exams into English declining 30%.

Figure 3


Figure 4 compares pass rates over time, using four data sets (2001-2011 through 2004-2014). The overall pass rate has remained largely stable, with a high of 15.64% and a low of 14.67%. The pass rate for high-volume languages closely mirrors the overall pass rate, just slightly below it, ranging from 15.16% to 13.81%. Not surprisingly, the ELV pass rate is quite a bit higher and more variable. The shift between the low 40% range in the first two data sets and the mid-30% range in the second two sets is attributable to a couple of language pairs with moderate pass rates moving from high-volume into the ELV category, pulling the average down. This did not have a noticeable effect on the high-volume pass rate, however, which shows how small the number of ELV exams is in the overall system.

Figure 4


It is now also possible to compare average pass rates over the four data sets for each language pair individually. (See Table 3 and Figure 5.) Table 3 shows the pass rates for each language pair over time, sorted by the pass rate (low to high), while Figure 5 is sorted and grouped by language for easier comparison. The standard deviation provided shows that the pass rate in each language pair has remained relatively stable over time.1 Even those language pairs with the largest fluctuations (English>Russian and English>German) have remained within a relatively narrow range over the four data sets (15.25%-21.89% and 22.30%-28.21%, respectively).
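The stability calculation behind Table 3 can be sketched as follows. The four per-data-set rates below are hypothetical: only the endpoints match the 22.30%-28.21% range the article reports for English>German, and the middle two values are invented.

```python
from statistics import mean, pstdev

# Four hypothetical pass rates (in %), one per 11-year data set, for a
# single language pair. Only the endpoints echo the reported range for
# English>German; the middle two values are invented for illustration.
rates = [22.30, 25.10, 26.40, 28.21]

# A small standard deviation relative to the mean indicates the pair's
# pass rate stayed within a narrow band across the four data sets.
print(f"mean: {mean(rates):.2f}%  std dev: {pstdev(rates):.2f}%")
```

Computing the spread per language pair, rather than across pairs, is what supports treating each pair as a separate, stable test.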

Table 3

Figure 5


The stability of these pass rates indicates that, although we can calculate an overall average pass rate for each data set, the more realistic figures are the individual average pass rates over time in each language pair. This also makes sense because, although all ATA exam passages are selected, administered, and graded according to the same criteria and all ATA graders are trained in the same methodology, each language pair must be considered a separate test. This is because the populations taking the tests are composed of completely different individuals (except for a very small number of individuals who test in two languages). In addition, the language training background and linguistic-cultural contexts for candidates in each language pair vary widely. This is particularly apparent in Figure 5, where it is possible to compare pass rates for languages in which ATA offers its certification exam in both directions.

The differences in pass rates between language directions vary from a low of 0.97% for the language pairs involving Spanish to a high of 10.46% for those involving Polish. In most, but not all, pairings, the exam into the foreign language has a higher pass rate. Given the relatively limited scope of foreign-language learning in the U.S., we might speculate that for many language pairs, the population taking the test into the foreign language includes a large percentage of native speakers of that language, while the population taking the test into English may include both native English speakers who learned the foreign language and fluent nonnative speakers of English trained in other cultures. However, given the data we have, it is impossible to draw any conclusions as to why pass rates differ.

We hope this detailed information on pass rates is interesting and useful to our members and potential candidates for the certification exam. The Certification Committee will continue to report the figures on a regular basis.


  1. Please note that the Polish>English and Dutch>English pass rates are based on only two data sets. This is because these language pairs entered ELV status in the 2003-2013 data set due to low demand for exams in these languages.

Geoffrey S. Koby is an ATA director and the immediate past chair of ATA’s Certification Committee. He is an associate professor of German/translation studies at Kent State University. Formerly the coordinator of the university’s BS in translation program and assistant to the chair, he teaches undergraduate and graduate courses in translation theory and practice. An ATA-certified German>English and Dutch>English translator, his professional practice focuses on business, legal, and financial translation. Contact:

Study resources for translation certification

Our team leader Helen has been a busy bee compiling a list of resources to help translators interested in taking the ATA certification exam. Even if you are not seeking certification, we felt there are many useful resources here we would like to share with you—from exam guidelines & translation tips to English & Spanish language, technology and copyediting resources. Use them to hone your craft and please let us know if you found them useful.

This list was reblogged with permission from Gaucha Translations blog.

From the ATA Certification program

From the WA DSHS Certification program

ATA Computerized exam

What is translation?

Articles on how to approach translation

English resources

Bilingual references

  • Word Reference
  • Linguee
  • Word Magic
  • Google Translate and Proz are not approved resources for the ATA computerized exam. No interactive resource (where you can ask a live question on a forum) is approved. The resources listed above are OK.
  • Click here to see the official ATA guidelines for computerized exams.

Plain Language

English copy editing training

Canada copy editing (includes certification)

Medical copy editing (AMWA has a certification program)

Resources from other translation certification programs

Copy editing tools to produce clean documents

Other training on translation, technology and other

Readers, would you add anything to this list of resources? Have you used any of these resources and found them useful?

Header image credit: tookapic