Code Coverage #33

Open · jeffrose opened this issue Apr 22, 2021 · 1 comment

Comments

@jeffrose

Is there a development pattern for getting code coverage when using karma-esbuild? I've previously used karma-coverage with karma-webpack but that isn't working with karma-esbuild. I noticed the definition of a COVERAGE global in your advanced example but wasn't sure what library that could be referencing.

@eric-hemasystems

I don't have a great solution but I have something that is better than nothing.

TL;DR

For those who just want some sort of coverage information, even if it's a hack, add the following as an esbuild plugin:

const { Buffer } = require('buffer')
const { createInstrumenter } = require('istanbul-lib-instrument')

const coveragePlugin = {
  name: 'code-coverage',
  setup(build) {
    const coverageInstrumenter = createInstrumenter({ esModules: true })

    // onEnd sees the finished bundle; outputFiles is only populated when
    // esbuild runs with write: false (which karma-esbuild relies on)
    build.onEnd((result) => {
      const js = result.outputFiles.find(f => f.path.match(/\.js$/))
      const sourceMap = result.outputFiles.find(f => f.path.match(/\.js\.map$/))
      const sourceMapObject = JSON.parse(sourceMap.text)

      // see "The Details" below — makes the report's relative paths resolve
      sourceMapObject.sourceRoot = '/'

      // instrument the whole bundle; the sourcemap lets istanbul attribute
      // coverage back to the original files
      const instrumented = coverageInstrumenter.instrumentSync(js.text, null, sourceMapObject)
      js.contents = Buffer.from(instrumented)
    })
  }
}

You'll need to add istanbul-lib-instrument to your package.json (a dev dependency is fine).

I do this conditionally: only in Karma, and only if a COVERAGE env variable is set. We obviously don't want code coverage calculated into our production code, and even for most test-suite runs we don't want to spend the time gathering the stats.
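For reference, the conditional wiring can look something like this in karma.conf.js. This is just a sketch: it assumes karma-esbuild forwards the esbuild key of the Karma config to esbuild's build options, and ./coverage-plugin is a hypothetical path to wherever you put the plugin above.

// karma.conf.js — sketch; assumes the `esbuild` key is forwarded to esbuild
const coveragePlugin = require('./coverage-plugin') // hypothetical module holding the plugin above

module.exports = (config) => {
  config.set({
    preprocessors: { 'test/**/*.js': ['esbuild'] },
    esbuild: {
      // instrument only when explicitly requested
      plugins: process.env.COVERAGE ? [coveragePlugin] : []
    }
  })
}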

Then continue to use karma-coverage to actually generate the report. You don't need to use its preprocessor, just the reporter.
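In the Karma config that just means enabling the reporter (these are standard karma-coverage options; adjust the reporter types to taste):

reporters: ['progress', 'coverage'],
coverageReporter: {
  dir: 'coverage/',
  reporters: [
    { type: 'html' },        // browsable per-file report
    { type: 'text-summary' } // quick console summary
  ]
}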

This will get you a coverage report that is not great, but at least something you can work with. It will include your test files as well as your node_modules files, which will drive down your coverage percentage.

To determine the correct percentage, we can run some JS within the report screen to count only the lines we are interested in. For me, all the runtime-code files have app/javascript in their paths, so I run the following in the JS console:

Array.from(document.querySelectorAll('tbody .file')).reduce((memo, element) => {
  // skip anything that isn't our runtime code
  if (!element.textContent.match(/app\/javascript/)) return memo

  // the 4th cell holds the covered/total fraction, e.g. "123/456"
  const data = element.closest('tr').querySelectorAll('td')[3].textContent.split('/', 2)
  memo.covered += Number.parseInt(data[0], 10)
  memo.total += Number.parseInt(data[1], 10)
  return memo
}, { covered: 0, total: 0 })

This gives a return value of:

{ covered: 3141, total: 3365 }

Some quick math (3141 / 3365 ≈ 93.3%) and I know that is a bit over 93% coverage.

The Details

For those interested in moving the ball forward, here is a write-up of what I have learned, in hopes it is helpful in getting to a proper solution.

Using istanbul as our code coverage tool (which is what karma-coverage uses), there are really two phases to generating a code coverage report:

  1. Instrumenting the code to gather the statistics
  2. Using those statistics to generate the report in the desired format

Let's look at step 2 first. If you add coverage as a Karma reporter, then karma-coverage will look for a variable in the test page called __coverage__, which contains all the statistics gathered by the instrumentation. This means that as long as we can populate the __coverage__ data, the existing functionality in karma-coverage should work just fine.
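You can sanity-check that phase in the browser console of the Karma debug page; the instrumented bundle should have populated the global before the reporter runs:

// run in the test page's console after the suite finishes
typeof window.__coverage__        // 'object' if instrumentation worked
Object.keys(window.__coverage__)  // one key per instrumented source file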

Step 1 is where we have a problem. Instrumenting the code with istanbul is what will populate that variable. There are two ways we can instrument the code and neither work with karma-esbuild.

  1. The first way is what karma-coverage provides, which is a preprocessor. The idea is that you let Karma load all the raw sources into the page; while it is doing that, it can preprocess your source files with the necessary instrumentation. But this is really incompatible with our goals, since the point of karma-esbuild is to let esbuild bundle our sources (applying any transformations) and inject that result into the test page.
  2. The other option is what I used to do when using webpack: babel-plugin-istanbul. This lets the JS get instrumented as it's being transformed by babel. I used to have this plugin activate only in the test env (see the sketch after this list).
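For completeness, that babel setup was roughly the standard babel-plugin-istanbul recipe (a sketch; the env name depends on what your test runner sets NODE_ENV/BABEL_ENV to):

// babel.config.js — instrument only in the test env
module.exports = {
  env: {
    test: {
      plugins: ['istanbul']
    }
  }
}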

babel-plugin-istanbul doesn't really meet our needs for two reasons:

  1. We may not be using babel. Although there is a babel plugin for esbuild, even the author of that plugin encourages using esbuild's transformations when they are sufficient, since they are faster.
  2. Even if I conditionally added the esbuild babel plugin just for testing, so I could use babel-plugin-istanbul, it only provides coverage for raw JS files. Files that are transformed into JS (such as Vue components) didn't get any coverage info. They did when I used webpack.

I'm not completely clear on the reason for that, but I think it relates to a comment in the plugin API limitations section of the esbuild documentation. It says:

One way to think about esbuild is as a "linker" for the web. Just like a linker for native code, esbuild's job is to take a set of files, resolve and bind references between them, and generate a single file containing all of the code linked together. A plugin's job is to generate the individual files that end up being linked.

Meanwhile, I think webpack and other bundlers like rollup are more like a shell pipe, where the content output of one plugin can feed into the next. That allows a plugin that transforms JS (like babel) to process Vue files: a Vue plugin converts them to JS, which then feeds into babel. I could be wrong on all this.

Therefore my solution is to use the onEnd hook to modify the entire bundled JS file. I provide istanbul that bundled JS as well as the sourcemap allowing it to map back to the original file (including transformed files like Vue code).

I did find I had to modify that sourcemap a bit. All the sources in the sourcemap had names like Users/me/projects/my-project/app/javascripts/.... I.e. it was the full filesystem path to each file, but missing the leading /. That made the paths relative, so the report ended up looking for /Users/me/projects/my-project/Users/me/projects/my-project/app/javascripts/..... The report obviously couldn't find those files, so while I got coverage stats I couldn't actually view the files.

To correct this I provided a source root of / to make that relative path work:

sourceMapObject.sourceRoot = '/'

All this got me a working report, although as I mentioned in the TL;DR section it included files I didn't want in the report. I'm not sure how to either tell istanbul not to instrument those parts of the bundle (based on the sourcemap paths), or how to tell the report to skip outputting those paths. Therefore I wrote a quick bit of JS that extracts the stats I need for the relevant files.

Solutions

What is a real solution to all this?

Maybe allow custom transformations to be defined for the JS loader. esbuild has built-in transformations; if there were a hook where I could provide my own custom transformation, that might be enough. I don't necessarily need to be given the AST of the JS. It might be sufficient to just be given the JS and to return modified JS, leaving the parsing of the JS up to the plugin (in this case, istanbul provides that parsing and modification of the source).
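To make that concrete, here is the kind of hook I am imagining. This is purely hypothetical; neither karma-esbuild nor esbuild has a transformJs option today:

// hypothetical API — not something that exists today
esbuild: {
  transformJs(source, { path }) {
    // given the bundled/transformed JS, return modified JS;
    // istanbul does the parsing and rewriting for me
    return coverageInstrumenter.instrumentSync(source, path)
  }
}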

Another solution might be to modify istanbul to be able to include/exclude paths from instrumentation. The key thing is that it would need to abide by the input sourcemap. This would let me drop the JS snippet I currently have to run on the report.
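Hypothetically that could be an option on the instrumenter, something like the following (istanbul-lib-instrument has no such option; this is just the shape I would want):

// hypothetical option — does not exist in istanbul-lib-instrument
createInstrumenter({
  esModules: true,
  // match against original paths resolved through the input sourcemap
  excludeSources: [/node_modules\//, /\.test\.js$/]
})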
