Know Your Enemy

Have you ever heard or read stories about CCIE candidates claiming that their failure on the exam was not their fault?

If you’ve started your journey toward earning your CCIE number, I’ll bet you’ve heard or read such stories and, in turn, I’ll bet they raise important concerns for you as you prepare for the exam.

 

This is, of course, legitimate, and I’d like to take a moment to discuss some considerations that might put these stories into perspective. While there are some instances where a test failure is indeed not the candidate’s responsibility, in most cases failure actually comes down to ineffective or nonexistent test-taking strategies. As discussed in the previous article, “The Expert’s Mindset”, time management and attention to detail are the two most unforgiving, implicit metrics of the exam…

 

Let's assume that you have extensively and properly prepared for the exam and that you’re feeling confident and proficient with most technical topics in the exam curriculum. You know what you don’t know and you know where to quickly find reliable information about it. You are able to effectively and efficiently perform technical tasks at the expected performance level in terms of time, amount of work and technical accuracy.

 

One could think that this is all it takes to pass the exam, and therefore that failure must be due to wording ambiguity, the test administrator’s lack of support, the test environment (workstation, keyboard, mouse, software tools, interface, devices, etc.), the room temperature or background noise, the uncomfortable chair, or any other condition outside the candidate’s control.

 

Again, I'm not saying that these conditions never happen, but what about non-technical skills such as exam-taking skills, stress, time and risk management, attention to detail, and so on?

If you want to pass the exam, it might be worth spending some time consciously building an effective strategy that accounts for all of these non-technical aspects.

 

Just like any project, you will want to factor all variables into the planning, not just the technical angle, right?

 

What Is an Exam?

The most fundamental considerations in developing and evaluating any test are its validity, reliability and fairness.

 

For instance, if a test were intended to certify an archer's ability to hit a target, the test would need to measure the archer's performance consistently, reliably and fairly. If the test is administered under identical conditions (similar distance to the target; bow, arrow and target attributes; environment; weather; etc.), then bias related to the archer's training, culture, geographic region, age, gender and so on is minimized, and arrows consistently hitting the center of the target reflect the archer's actual ability.

 

The next fundamental consideration in test development is the intended use of the test score. Is the test intended to rank candidates and select only the best 10 percent, like a college entrance test? Or is the test intended to discriminate between qualified and non-qualified candidates, regardless of how many candidates have already passed it?

 

In the archer's test example, the intended use of the test could be to grant an expert-level certificate to any archer who is able to reach the center of the target at a minimum rate of, say, 80 percent of arrows fired (among other tasks…).

 

So if the intention is to make a pass-or-fail decision, then the test needs to define the minimum performance an archer must demonstrate in order to be recognized as an expert-level archer. There are multiple ways to objectively define this minimum performance level, but that is not really the purpose of this article, so let's focus back on our CCIE R&S v5.0.

 

Exam Design Attributes

As you know, the intended use of the CCIE test is to discriminate between qualified and non-qualified engineers, regardless of how many candidates take or pass the test. Anyone who demonstrates performance that meets or exceeds the test's cut-score will receive a CCIE number, regardless of how close to or how far above the cut-score his/her score is. There is no concept of a better or worse CCIE. This is why the score report doesn't mention anything other than "passed" when the candidate's score meets or exceeds the cut-score. Someone who aces the exam with 100 percent receives the exact same credential as someone who passes on the borderline!

Isn't this a fundamental consideration when building a strategy to succeed on the exam?

And it's not the only one…

 

Let's see what other design attributes are publicly disclosed and which ones could be useful in building that effective strategy.

 

The CCIE R&S v5.0 exam is composed of four independent test modules: the written exam (WR) and the three lab exam modules, troubleshooting (TS), diagnostic (DIAG) and configuration (CFG). Each module has its own set of design attributes; rather than discuss each one in detail, the following table provides a clear comparison of all four modules.

 

[Table: Exam Modules Design Attributes]

 

Let's clarify each attribute to avoid any ambiguity.

 

Intended use of the test result: As discussed above, this is the final objective of the exam module. All four modules are “pass/fail” exams, with additional constraints in the lab exam. I would suggest that a candidate target passing the exam rather than trying to outscore anyone else or ace it. If needed, put the ego aside and aim at passing the exam, not acing it!

 

Constraints in scoring logic: This refers to the additional rules applied to the scoring calculation. Both the written and lab exams define a cut-score (that is, a passing score), while only the lab exam adds the further restriction of a minimum score per module. Not only does the total score need to meet or exceed the “lab-level” cut-score, but each of the three modules’ scores must also meet or exceed its respective “module-level” minimum score. So a candidate may compensate for weakness in one module, like TS, with strength in another, like CFG, but only as long as every module still clears its own minimum. If the total score (the sum of all three modules’ scores) meets or exceeds the “lab-level” cut-score and each module’s score meets or exceeds its “module-level” minimum score, then it is a pass. If any module’s score doesn’t meet its minimum score, then the whole exam is failed, regardless of the scores in the other modules. This is super important.
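
To make this rule concrete, here is a minimal sketch of the lab-level pass/fail decision. The module minimums and cut-score below are made-up placeholders (Cisco does not publish the real values); only the two-rule logic itself comes from the description above.

```python
# Minimal sketch of the lab-level pass/fail logic described above.
# All numeric values are hypothetical placeholders, not the real (undisclosed) cut-scores.

MODULE_MINIMUMS = {"TS": 30, "DIAG": 20, "CFG": 50}  # hypothetical module-level minimum scores
LAB_CUT_SCORE = 110                                  # hypothetical lab-level cut-score

def lab_result(scores: dict) -> str:
    """Pass only if every module clears its own minimum AND the total clears the cut-score."""
    # Rule 1: each module's score must meet or exceed its module-level minimum.
    if any(scores[module] < minimum for module, minimum in MODULE_MINIMUMS.items()):
        return "FAIL"
    # Rule 2: the sum of the three module scores must meet or exceed the lab-level cut-score.
    if sum(scores.values()) < LAB_CUT_SCORE:
        return "FAIL"
    return "PASS"

# A strong total does not help if a single module misses its minimum:
print(lab_result({"TS": 45, "DIAG": 15, "CFG": 60}))  # FAIL (DIAG below its minimum, total 120)
print(lab_result({"TS": 35, "DIAG": 25, "CFG": 55}))  # PASS (every minimum met, total 115)
```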

 

All items are visible at the start of the exam: The written exam doesn’t allow going back and forth between questions (items). It is only possible to move forward, not backward. Therefore, a candidate will not see all items when starting the written exam. On the other hand, all items are available when starting any lab module, so a candidate can cherry-pick items or decide in which sequence to address them. This is another important attribute that is frequently overlooked.

 

Item’s score value is visible during the exam: Knowing beforehand which items carry more or less point value compared to others is also important when deciding how to approach the exam and which item to address first. Even though an item’s point value doesn’t always indicate its difficulty, it will most likely help to prioritize items relative to each other.
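
As a trivial illustration, here is one way to order visible lab items by point value before starting; the item names and values are invented for the example.

```python
# Hypothetical items with visible point values; sorting them is one simple way to decide
# which to tackle first (point value is not a perfect proxy for difficulty).
items = [
    {"id": "item-1", "points": 2},
    {"id": "item-2", "points": 4},
    {"id": "item-3", "points": 3},
]

for item in sorted(items, key=lambda i: i["points"], reverse=True):
    print(item["id"], item["points"])
# item-2 4
# item-3 3
# item-1 2
```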

 

Item’s scoring opportunities: All items in the CCIE R&S v5.0 exam modules are “dichotomous”: their possible score outcome is either everything (the item’s score value) or nothing (zero points). There are no “polytomous” items, which would allow scoring only a portion of the item’s score value. So in order to get the item’s score value, a candidate must meet all requirements of the item. Meeting only a subset of the requirements doesn’t get you a portion of the score value. There is no partial grading in any exam module of CCIE R&S v5.0.
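
Here is a minimal sketch of what dichotomous grading means in practice; the point value and requirement list are made up for the example.

```python
# All-or-nothing (dichotomous) item grading, as described above.
def item_score(point_value: int, requirements_met: list) -> int:
    # Full points only when EVERY requirement of the item is satisfied; otherwise zero.
    return point_value if all(requirements_met) else 0

# An item worth 4 points with one missed requirement scores 0, not 3:
print(item_score(4, [True, True, True, False]))  # 0
print(item_score(4, [True, True, True, True]))   # 4
```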

 

Access to devices’ console during the exam: Only the TS and CFG modules offer remote console access to “living” devices. The written exam and DIAG provide documentation only (pictures, text or possibly tools such as a traffic analyzer output).

 

Number of topologies used throughout the exam: Since the written exam and DIAG do not rely on “living” devices, each item may use a separate network topology. TS and CFG, on the other hand, are each based on a single scenario with a single topology used throughout the exam module.

 

Item inter-dependence (resolution of item X depends on resolution of item Y): Only CFG’s items are inter-dependent. Layer 3 builds on top of Layer 2, so if your Layer 2 is broken, there is little chance that Layer 3 will work as expected. This is the nature of the technologies we’re dealing with, and testing these inter-dependencies is intentional when designing CFG’s scenarios and item requirements. The items of the three other exam modules, however, are independent of each other. So the resolution of TS item number 3 doesn’t rely on the resolution of item number 2!

 

Possible solution(s): All items on the written exam and the DIAG module have only one possible solution; therefore their scoring is deterministic and the automated scoring script is simple. On the other hand, TS and CFG items expect specific outcomes that, most of the time, can be achieved using many different approaches. The challenge is finding an efficient solution that requires as few command-line interactions as possible and does not violate any guideline or additional constraint.

 

Time limit: I can’t think of any performance measurement that doesn’t involve time, so it is natural that performance tests have a time limit. Knowing the pressure of the exam itself, however, CCIE R&S v5.0 introduces some flexibility in the lab exam. While TS is designed to be doable within two hours, the system allows candidates to spend up to two and a half hours working on TS items. Since the total exam time is still limited to eight hours, this optional additional time is automatically deducted from the time credit allotted to CFG. It is the candidate’s choice whether to borrow up to thirty minutes from CFG and use it in TS. This is a strategic decision, and candidates should consider carefully whether or not to use that option before taking the exam. DIAG’s time limit is fixed at thirty minutes, no more and no less, so there is no flexibility or strategic decision there.
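
To visualize the trade-off, here is a small sketch of the lab-day time budget in minutes. The eight-hour total, the thirty-minute DIAG slot and the TS figures come from the description above; the function simply shows how borrowed TS time eats into CFG.

```python
# Lab-day time budget in minutes, based on the figures described above.
TOTAL_TIME = 8 * 60   # eight hours for the whole lab day
TS_DESIGNED = 120     # TS is designed to fit in two hours...
TS_MAXIMUM = 150      # ...but the system allows up to two and a half hours
DIAG_TIME = 30        # fixed, no flexibility

def cfg_time(ts_time_used: int) -> int:
    """Time left for CFG: every minute spent in TS beyond two hours comes out of CFG."""
    ts_time_used = min(ts_time_used, TS_MAXIMUM)
    return TOTAL_TIME - ts_time_used - DIAG_TIME

print(cfg_time(TS_DESIGNED))  # 330 minutes (5h30) for CFG when TS takes exactly two hours
print(cfg_time(TS_MAXIMUM))   # 300 minutes (5h00) for CFG when the full 30 minutes are borrowed
```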

 

Please keep in mind that all these design attributes are subject to change and that the above table is applicable for the current CCIE R&S v5.0, as of August 2014.

 

In the next article, I'll attempt to compile some concrete recommendations and examples of efficient strategies that take into account these exam design attributes.

 

I hope this is useful. As usual, please feel free to comment below.

Namaste.