diff --git a/LICENSE b/LICENSE new file mode 100644 index 0000000..261eeb9 --- /dev/null +++ b/LICENSE @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. 
Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. 
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..dd3a6c3
--- /dev/null
+++ b/README.md
@@ -0,0 +1,112 @@
+# Go Coverage Plus
+Optimises the coverage report in Go text format with information from the source code.
+It can also write the Cobertura format, including complexity metrics and branch coverage.
+
+## Installation
+Simply install it with `go install`:
+```
+go install github.com/Fabianexe/gocoverageplus@latest
+```
+
+## Usage
+`gocoverageplus` can be run without any arguments (falling back to its defaults).
+However, it needs a config file in JSON format (see the Config section).
+You can further specify the config path, a coverage input path, and an output path.
+
+### Flags
+* `-h` or `--help` to get a help message
+* `-c` or `--config` to specify the config file (default is `.cov.json`)
+* `-o` or `--output` to specify the output file (default is `coverage.xml`)
+* `-i` or `--input` to specify the input file (default is `coverage.cov`)
+* `-v` or `--verbose` to get more output. Can be used multiple times to increase the verbosity.
+
+### Config
+The config file is a JSON file with the following structure:
+```json
+{
+    "OutputFormat": "cobertura",
+    "SourcePath": "./",
+    "Cleaner": {
+        "ErrorIf": true,
+        "NoneCodeLines": true,
+        "Generated": true,
+        "CustomIf": [
+            "debug"
+        ]
+    },
+    "Complexity": {
+        "Active": true,
+        "Type": "cognitive"
+    }
+}
+```
+As output format, you can choose between `cobertura` and `textfmt`. The first is described in the Cobertura section,
+the second is the default Go coverage format.
+Complexity only applies to the `cobertura` format. The complexity type can be either `cognitive` or `cyclomatic`.
+The difference between these metrics is described in the Cobertura section.
+
+## The accuracy of `go test -coverprofile`
+The `go test -coverprofile` command is a great tool to get coverage information about your project.
+However, it measures coverage on a block level. This means that if your function contains empty lines, lines with only comments,
+or lines with only a closing bracket, they are still counted in the line metrics.
+
+This project addresses that problem by using the `go/ast` package to determine the actual lines of code from the source.
+
+Another result of this is that branches can be determined on a line level. If a line contains an `if` statement
+with multiple conditions, it is still a single block in the coverage profile. There are projects that try to solve this problem,
+for example [gobco](https://github.com/rillig/gobco); however, they are currently not compatible with the Jenkins coverage plugin.
+Thus, we add branch coverage on the method and file level, where such multi-condition statements are counted as one branch.
+
+## Source Code Filter
+There are parts of the source code that you may not want to include in the coverage report.
+At the moment, the following parts can be excluded:
+* Generated files
+  * Files that follow [this convention](https://go.dev/s/generatedcode) are excluded
+* Non-code lines
+  * Empty lines
+  * Lines that only contain a comment
+  * Lines that only contain a closing bracket
+* Error ifs
+  * If statements that only contain an error check (`if err != nil`) with only a return in the body are excluded
+* Custom ifs
+  * If statements whose condition is a single bool with a name given in the config list (`if debug`) and that have no else part are excluded
+
+You can activate these filters through the corresponding config values.
+
+# Cobertura Format
+The Cobertura format is a widely used format for coverage reports. It is supported by many tools, such as Jenkins.
+It is an XML format that contains the coverage information for each file and package.
+Besides the coverage information, it also contains the complexity metrics for each function.
+The format is described [here](https://github.com/cobertura/cobertura/blob/master/cobertura/src/site/htdocs/xml/coverage-04.dtd).
+## Cyclomatic Complexity vs Cognitive Complexity
+
+Cyclomatic Complexity and Cognitive Complexity are both software metrics used to measure the complexity of a program. They are used to judge the quality of code and identify areas that might need refactoring. However, they approach the measurement of complexity from different perspectives.
+
+### Cyclomatic Complexity
+
+Cyclomatic Complexity, introduced by Thomas McCabe in 1976, is a quantitative measure of the number of linearly independent paths through a program's source code. It is computed from the program's control flow graph as:
+
+```
+Cyclomatic Complexity = Edges - Nodes + 2*Connected Components
+```
+
+Cyclomatic Complexity is primarily used to evaluate the complexity and understandability of a program, and it can also give an idea of the number of test cases needed to achieve full branch coverage.
+
+### Cognitive Complexity
+
+Cognitive Complexity, introduced by SonarSource, is a measure that focuses on how difficult the code is to understand for a human reader. It considers things like the level of nesting, the number of break or continue statements, the number of conditions in a decision point, and the use of language structures that unnecessarily increase complexity.
+
+Cognitive Complexity aims to produce a measurement that correlates more closely with a developer's experience of a code base, making it easier to identify problematic areas of code that need refactoring.
+
+### Summary
+
+In summary, while Cyclomatic Complexity is a measure of the structural complexity of a program, Cognitive Complexity is a measure of how difficult a program is to understand for a human reader.
+Both are useful, but they serve different purposes and can lead to different conclusions about the code's quality.
+
+## Others
+So far we are aware of two other projects that do something similar:
+* [gocov-xml](https://github.com/AlekSi/gocov-xml)
+* [gocover-cobertura](https://github.com/boumenot/gocover-cobertura)
+
+However, both of them focus on the coverage part and carry over a big downside of the `go test -coverprofile` command.
+Furthermore, this project adds complexity metrics, more options to determine coverage, and branch coverage.
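+
+## Example Workflow
+A typical run, assuming the default file names from the flags above and a `.cov.json` in the repository root, could look like this:
+```
+go test ./... -coverprofile=coverage.cov
+gocoverageplus -i coverage.cov -o coverage.xml
+```
+The resulting `coverage.xml` can then be consumed by Cobertura-compatible tools such as the Jenkins coverage plugin.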
\ No newline at end of file diff --git a/go.mod b/go.mod new file mode 100644 index 0000000..6badbbf --- /dev/null +++ b/go.mod @@ -0,0 +1,17 @@ +module github.com/Fabianexe/gocoverageplus + +go 1.22.0 + +toolchain go1.23.0 + +require ( + github.com/spf13/cobra v1.8.1 + github.com/spf13/pflag v1.0.5 + golang.org/x/tools v0.26.0 +) + +require ( + github.com/inconshreveable/mousetrap v1.1.0 // indirect + golang.org/x/mod v0.21.0 // indirect + golang.org/x/sync v0.8.0 // indirect +) diff --git a/go.sum b/go.sum new file mode 100644 index 0000000..411813b --- /dev/null +++ b/go.sum @@ -0,0 +1,16 @@ +github.com/cpuguy83/go-md2man/v2 v2.0.4/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o= +github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8= +github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw= +github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM= +github.com/spf13/cobra v1.8.1 h1:e5/vxKd/rZsfSJMUX1agtjeTDf+qv1/JdBF8gg5k9ZM= +github.com/spf13/cobra v1.8.1/go.mod h1:wHxEcudfqmLYa8iTfL+OuZPbBZkmvliBWKIezN3kD9Y= +github.com/spf13/pflag v1.0.5 h1:iy+VFUOCP1a+8yFto/drg2CJ5u0yRoB7fZw3DKv/JXA= +github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg= +golang.org/x/mod v0.21.0 h1:vvrHzRwRfVKSiLrG+d4FMl/Qi4ukBCE6kZlTUkDYRT0= +golang.org/x/mod v0.21.0/go.mod h1:6SkKJ3Xj0I0BrPOZoBy3bdMptDDU9oJrpohJ3eWZ1fY= +golang.org/x/sync v0.8.0 h1:3NFvSEYkUoMifnESzZl15y791HH1qU2xm6eCJU5ZPXQ= +golang.org/x/sync v0.8.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk= +golang.org/x/tools v0.26.0 h1:v/60pFQmzmT9ExmjDv2gGIfi3OqfKoEP6I5+umXlbnQ= +golang.org/x/tools v0.26.0/go.mod h1:TPVVj70c7JJ3WCazhD8OdXcZg/og+b9+tH/KxylGwH0= +gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0= +gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= diff --git a/main.go b/main.go new file mode 100644 index 0000000..0173d76 --- /dev/null +++ b/main.go @@ -0,0 +1,9 @@ +package main + +import ( + "github.com/Fabianexe/gocoverageplus/pkg/commands" +) + +func main() { + commands.RootCommand() +} diff --git a/pkg/cleaner/cleaner.go b/pkg/cleaner/cleaner.go new file mode 100644 index 0000000..e0ea8da --- /dev/null +++ b/pkg/cleaner/cleaner.go @@ -0,0 +1,32 @@ +// Package cleaner cleans the coverage data and discard lines, classes and packages that are not relevant +package cleaner + +import ( + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +// CleanData cleans the package data +func CleanData( + project *entity.Project, + cGeneratedFiles bool, + cNoneCodeLines bool, + cErrorIf bool, + customIfNames []string, +) *entity.Project { + if cGeneratedFiles { + project = cleanGeneratedFiles(project) + } + + if cNoneCodeLines { + project = cleanNoneCodeLines(project) + } + + if cErrorIf { + project = cleanErrorIf(project) + } + if len(customIfNames) > 0 { + project = cleanCustomIf(project, customIfNames) + } + + return project +} diff --git a/pkg/cleaner/custom_if.go b/pkg/cleaner/custom_if.go new file mode 100644 index 0000000..edd075f --- /dev/null +++ b/pkg/cleaner/custom_if.go @@ -0,0 +1,95 @@ +package cleaner + +import ( + "go/ast" + "go/token" + "log/slog" + "slices" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +// cleanErrorIf removes all error if statements from the package data +// An error if statement is an if statement that checks if a variable x is not nil, x is named err or is 
of type error and has only a return statement in the body.
+// cleanCustomIf, in contrast, removes if statements whose condition is a single identifier taken from the configured name list (for example `if debug`) and that have no else branch.
+func cleanCustomIf(project *entity.Project, names []string) *entity.Project {
+	slog.Info("Clean custom if statements")
+	for _, p := range project.Packages {
+		for _, f := range p.Files {
+			var countCustomIf int
+			for _, method := range f.Methods {
+				vis := &cleanCustomIfVisitor{
+					fset:  p.Fset,
+					names: names,
+				}
+
+				ast.Walk(vis, method.Body)
+
+				countCustomIf += len(vis.customIF)
+
+				for _, errIf := range vis.customIF {
+					method.Tree.AddBlock(
+						&entity.Block{
+							StartPosition: errIf.start,
+							EndPosition:   errIf.end,
+							DefPosition:   errIf.start,
+							Type:          entity.TypeBlock,
+							Ignore:        true,
+						},
+					)
+				}
+			}
+			if countCustomIf > 0 {
+				slog.Debug("Cleaned custom if statements", "File", f.FilePath, "CustomIfs", countCustomIf)
+			}
+		}
+	}
+
+	return project
+}
+
+type cleanCustomIfVisitor struct {
+	customIF []customIF
+	fset     *token.FileSet
+	names    []string
+}
+
+type customIF struct {
+	start token.Position
+	end   token.Position
+}
+
+func (c *cleanCustomIfVisitor) Visit(node ast.Node) (w ast.Visitor) {
+	if !checkCustomIf(node, c.names) {
+		return c
+	}
+
+	c.customIF = append(c.customIF, customIF{
+		start: c.fset.Position(node.Pos()),
+		end:   c.fset.Position(node.End()),
+	})
+
+	return c
+}
+
+func checkCustomIf(node ast.Node, names []string) bool {
+	if node == nil { // nothing here
+		return false
+	}
+
+	v, ok := node.(*ast.IfStmt)
+	if !ok { // no if statement
+		return false
+	}
+
+	if v.Cond == nil || // no condition
+		v.Else != nil { // has else
+		return false
+	}
+
+	cond, ok := v.Cond.(*ast.Ident)
+	if !ok { // condition is not a plain identifier
+		return false
+	}
+
+	return slices.Contains(names, cond.Name)
+}
diff --git a/pkg/cleaner/err_if.go b/pkg/cleaner/err_if.go
new file mode 100644
index 0000000..cfd2cd8
--- /dev/null
+++ b/pkg/cleaner/err_if.go
@@ -0,0 +1,126 @@
+package cleaner
+
+import (
+	"go/ast"
+	"go/token"
+	"log/slog"
+
+	"github.com/Fabianexe/gocoverageplus/pkg/entity"
+)
+
+// cleanErrorIf removes all error if statements from the package data
+// An error if statement is an if statement that checks if a variable x is not nil, x is named err or is of type error and has only a return statement in the body.
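+// For example, a typical check like the following is excluded from the line and branch metrics:
+//
+//	if err != nil {
+//		return err
+//	}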
+func cleanErrorIf(project *entity.Project) *entity.Project { + slog.Info("Clean error if statements") + for _, p := range project.Packages { + for _, f := range p.Files { + var countErrorIf int + for _, method := range f.Methods { + cleanErrorIfVisitor := &cleanErrorIfVisitor{ + fset: p.Fset, + } + + ast.Walk(cleanErrorIfVisitor, method.Body) + + countErrorIf += len(cleanErrorIfVisitor.errorIfs) + + for _, errIf := range cleanErrorIfVisitor.errorIfs { + method.Tree.AddBlock( + &entity.Block{ + StartPosition: errIf.start, + EndPosition: errIf.end, + DefPosition: errIf.start, + Type: entity.TypeBlock, + Ignore: true, + }, + ) + } + } + if countErrorIf > 0 { + slog.Debug("Cleaned error if statements", "File", f.FilePath, "ErrorIfs", countErrorIf) + } + } + } + + return project +} + +type cleanErrorIfVisitor struct { + errorIfs []errorIF + fset *token.FileSet +} + +type errorIF struct { + start token.Position + end token.Position +} + +func (c *cleanErrorIfVisitor) Visit(node ast.Node) (w ast.Visitor) { + if !isErrorIf(node) { + return c + } + + c.errorIfs = append(c.errorIfs, errorIF{ + start: c.fset.Position(node.Pos()), + end: c.fset.Position(node.End()), + }) + + return c +} + +func isErrorIf(node ast.Node) bool { + if node == nil { // nothing here + return false + } + + v, ok := node.(*ast.IfStmt) + if !ok { // no if statement + return false + } + + if v.Cond == nil || // np condition + v.Else != nil || // has else + len(v.Body.List) != 1 { // more than one statement in Body + return false + } + + cond, ok := v.Cond.(*ast.BinaryExpr) + if !ok { // no binary expression + return false + } + + if cond.Op != token.NEQ { // not != + return false + } + + if compare, ok := cond.Y.(*ast.Ident); !ok || compare.Name != "nil" { // not compared against nil + return false + } + + if _, ok := v.Body.List[0].(*ast.ReturnStmt); !ok { // body is not a return statement + return false + } + + return isErrorVar(cond) +} + +func isErrorVar(cond *ast.BinaryExpr) bool { + object, ok := cond.X.(*ast.Ident) + if !ok || object.Obj == nil || object.Obj.Kind != ast.Var { // not a variable + return false + } + + if object.Name != "err" { // not named err + // try to determine type of object + typ, ok := object.Obj.Decl.(*ast.ValueSpec) + if !ok { // not a value spec + return false + } + + if n, ok := typ.Type.(*ast.Ident); !ok || n.Name != "error" { // not an error + return false + } + } + + return true +} diff --git a/pkg/cleaner/generated.go b/pkg/cleaner/generated.go new file mode 100644 index 0000000..47c7dd5 --- /dev/null +++ b/pkg/cleaner/generated.go @@ -0,0 +1,26 @@ +package cleaner + +import ( + "go/ast" + "log/slog" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +func cleanGeneratedFiles(project *entity.Project) *entity.Project { + slog.Info("Clean generated files") + for _, p := range project.Packages { + i := 0 + for i < len(p.Files) { + f := p.Files[i] + if ast.IsGenerated(f.Ast) { + slog.Debug("Remove generated file", "File", f.FilePath) + p.Files = append(p.Files[:i], p.Files[i+1:]...) 
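+				// do not advance i here: the append above removed p.Files[i], so the next file now occupies index i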
+ continue + } + i++ + } + } + + return project +} diff --git a/pkg/cleaner/none_code.go b/pkg/cleaner/none_code.go new file mode 100644 index 0000000..984b420 --- /dev/null +++ b/pkg/cleaner/none_code.go @@ -0,0 +1,64 @@ +package cleaner + +import ( + "go/ast" + "go/token" + "log/slog" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +func cleanNoneCodeLines(project *entity.Project) *entity.Project { + slog.Info("Clean none code lines") + for _, p := range project.Packages { + for _, f := range p.Files { + var cleanedLines int + for _, method := range f.Methods { + noneCodeVisitor := &noneCodeVisitor{ + validLines: make(map[int]struct{}, 128), + fset: p.Fset, + } + + ast.Walk(noneCodeVisitor, method.Body) + + for line := method.Tree.StartPosition.Line; line < method.Tree.EndPosition.Line; line++ { + if _, ok := noneCodeVisitor.validLines[line]; !ok { + cleanedLines++ + start := method.File.Position(method.File.LineStart(line)) + end := method.File.Position(method.File.LineStart(line+1) - 1) + method.Tree.AddBlock( + &entity.Block{ + StartPosition: start, + EndPosition: end, + DefPosition: start, + Type: entity.TypeBlock, + Ignore: true, + }, + ) + } + } + } + if cleanedLines > 0 { + slog.Debug("Cleaned lines", "File", f.FilePath, "Lines", cleanedLines) + } + } + } + + return project +} + +type noneCodeVisitor struct { + validLines map[int]struct{} + fset *token.FileSet +} + +func (n *noneCodeVisitor) Visit(node ast.Node) (w ast.Visitor) { + if node == nil { + return n + } + + lineNUmber := n.fset.Position(node.Pos()).Line + n.validLines[lineNUmber] = struct{}{} + + return n +} diff --git a/pkg/commands/commands.go b/pkg/commands/commands.go new file mode 100644 index 0000000..6af1dbb --- /dev/null +++ b/pkg/commands/commands.go @@ -0,0 +1,150 @@ +// Package commands contains all cobra commands that are used from the main +package commands + +import ( + "fmt" + "log/slog" + "os" + "path/filepath" + + "github.com/spf13/cobra" + + "github.com/Fabianexe/gocoverageplus/pkg/cleaner" + "github.com/Fabianexe/gocoverageplus/pkg/complexity" + "github.com/Fabianexe/gocoverageplus/pkg/config" + "github.com/Fabianexe/gocoverageplus/pkg/coverage" + "github.com/Fabianexe/gocoverageplus/pkg/source" + "github.com/Fabianexe/gocoverageplus/pkg/writer" +) + +func RootCommand() { + var rootCmd = &cobra.Command{ //nolint:gochecknoglobals + Use: "gocoverageplus", + Short: "gocoverageplus optimise the coverage report in go text format with source code.", + Run: func(cmd *cobra.Command, _ []string) { + initLogger() + slog.Info("Start flag parsing") + configPath, err := cmd.Flags().GetString("config") + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + inputPath, err := cmd.Flags().GetString("input") + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + outputPath, err := cmd.Flags().GetString("output") + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + slog.Info("Read Config") + conf, err := config.ReadConfig(configPath) + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + if err := conf.Validate(); err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + sourcePath, err := filepath.Abs(conf.SourcePath) + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + slog.Info("Load sources") + project, err := source.LoadSources(sourcePath) + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + + slog.Info("Clean data") + project = cleaner.CleanData( + project, + 
conf.Cleaner.Generated, + conf.Cleaner.NoneCodeLines, + conf.Cleaner.ErrorIf, + conf.Cleaner.CustomIf, + ) + + if conf.Complexity.Active { + slog.Info("Add complexity") + cyclomatic := false + if conf.Complexity.Type == "cyclomatic" { + cyclomatic = true + } + project = complexity.AddComplexity(project, cyclomatic) + } + + if inputPath != "-" { + slog.Info("Load coverage") + project, err = coverage.LoadCoverage(project, inputPath) + + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + } + + slog.Info("Write output") + if conf.OutputFormat == "cobertura" { + err = writer.WriteXML(sourcePath, project, outputPath) + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + } else if conf.OutputFormat == "textfmt" { + + err = writer.WriteTextFMT(project, outputPath) + if err != nil { + slog.Error(fmt.Sprintf("%+v", err)) + os.Exit(1) + } + } else { + slog.Error("Unknown output format") + os.Exit(1) + } + }, + } + + rootCmd.PersistentFlags().StringP( + "config", + "c", + ".cov.json", + "The config file path", + ) + + rootCmd.PersistentFlags().StringP( + "input", + "i", + "coverage.cov", + "The input file path", + ) + + rootCmd.PersistentFlags().StringP( + "output", + "o", + "coverage.xml", + "The output file path", + ) + + verboseFlag := rootCmd.PersistentFlags().VarPF( + &verbose, + "verbose", + "v", + "Add verbose output. Multiple -v options increase the verbosity.", + ) + verboseFlag.NoOptDefVal = "1" + + if err := rootCmd.Execute(); err != nil { + panic(err) + } +} diff --git a/pkg/commands/logger.go b/pkg/commands/logger.go new file mode 100644 index 0000000..a7f0e40 --- /dev/null +++ b/pkg/commands/logger.go @@ -0,0 +1,44 @@ +package commands + +import ( + "log/slog" + "os" + + "github.com/spf13/pflag" +) + +type multiFlag int + +var _ pflag.Value = (*multiFlag)(nil) + +func (m *multiFlag) String() string { + return "verbose" +} + +func (m *multiFlag) Set(_ string) error { + *m++ + + return nil +} + +func (m *multiFlag) Type() string { + return "bool" +} + +var verbose multiFlag + +func initLogger() { + var programLevel = new(slog.LevelVar) + switch verbose { + case 0: + programLevel.Set(slog.LevelError) + case 1: + programLevel.Set(slog.LevelInfo) + default: + programLevel.Set(slog.LevelDebug) + } + h := slog.NewJSONHandler(os.Stderr, &slog.HandlerOptions{ + Level: programLevel, + }) + slog.SetDefault(slog.New(h)) +} diff --git a/pkg/complexity/cognitive.go b/pkg/complexity/cognitive.go new file mode 100644 index 0000000..617df63 --- /dev/null +++ b/pkg/complexity/cognitive.go @@ -0,0 +1,44 @@ +package complexity + +import ( + "go/ast" +) + +func getCognitiveComplexity(root ast.Node) int { + visitor := &cognitiveVisitor{ + complexity: 1, + } + + ast.Walk(visitor, root) + + return visitor.complexity + +} + +type cognitiveVisitor struct { + complexity int + level []ast.Node +} + +func (c *cognitiveVisitor) Visit(node ast.Node) (w ast.Visitor) { + if node == nil { + return c + } + + for len(c.level) > 0 && c.level[len(c.level)-1].End() < node.Pos() { + c.level = c.level[:len(c.level)-1] + } + switch node.(type) { + case *ast.IfStmt, + *ast.ForStmt, + *ast.RangeStmt, + *ast.FuncDecl, + *ast.SwitchStmt, + *ast.TypeSwitchStmt, + *ast.SelectStmt: + c.complexity += len(c.level) + 1 // function is the missing level + c.level = append(c.level, node) + } + + return c +} diff --git a/pkg/complexity/complexity.go b/pkg/complexity/complexity.go new file mode 100644 index 0000000..e13f2d7 --- /dev/null +++ b/pkg/complexity/complexity.go @@ -0,0 +1,30 @@ +// Package 
complexity enriches the enties with complexity metrics +package complexity + +import ( + "log/slog" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +// AddComplexity adds complexity metrics to the packages +func AddComplexity(project *entity.Project, useCyclomaticComplexity bool) *entity.Project { + if useCyclomaticComplexity { + slog.Info("Use cyclomatic complexity") + } else { + slog.Info("Use cognitive complexity") + } + for _, p := range project.Packages { + for _, f := range p.Files { + for _, method := range f.Methods { + if useCyclomaticComplexity { + method.Complexity = getCyclomaticComplexity(method.Body) + } else { + method.Complexity = getCognitiveComplexity(method.Body) + } + } + } + } + + return project +} diff --git a/pkg/complexity/cyclomatic.go b/pkg/complexity/cyclomatic.go new file mode 100644 index 0000000..01d1850 --- /dev/null +++ b/pkg/complexity/cyclomatic.go @@ -0,0 +1,35 @@ +package complexity + +import ( + "go/ast" +) + +func getCyclomaticComplexity(root ast.Node) int { + visitor := &cyclomaticVisitor{ + complexity: 1, + } + + ast.Walk(visitor, root) + + return visitor.complexity + +} + +type cyclomaticVisitor struct { + complexity int +} + +func (c *cyclomaticVisitor) Visit(node ast.Node) (w ast.Visitor) { + switch node.(type) { + case *ast.IfStmt, + *ast.ForStmt, + *ast.RangeStmt, + *ast.FuncDecl, + *ast.SwitchStmt, + *ast.TypeSwitchStmt, + *ast.SelectStmt: + c.complexity++ + } + + return c +} diff --git a/pkg/config/config.go b/pkg/config/config.go new file mode 100644 index 0000000..99a0dce --- /dev/null +++ b/pkg/config/config.go @@ -0,0 +1,57 @@ +package config + +import ( + "encoding/json" + "fmt" + "os" + "slices" +) + +type Config struct { + OutputFormat string + SourcePath string + Cleaner struct { + ErrorIf bool + NoneCodeLines bool + Generated bool + CustomIf []string + } + Complexity struct { + Active bool + Type string + } +} + +func ReadConfig(path string) (Config, error) { + // Read config from file + content, err := os.ReadFile(path) + if err != nil { + return Config{}, err + } + + c := Config{} + if err := json.Unmarshal(content, &c); err != nil { + return Config{}, err + } + + return c, nil +} + +func (c *Config) Validate() error { + // Validate config + if !slices.Contains([]string{"textfmt", "cobertura"}, c.OutputFormat) { + return fmt.Errorf("output format must be one of textfmt or cobertura") + } + + if c.SourcePath == "" { + return fmt.Errorf("source path is empty") + } + + if c.Complexity.Active { + if !slices.Contains([]string{"cyclomatic", "cognitive"}, c.Complexity.Type) { + return fmt.Errorf("complexity type must be one of cyclomatic or cognitive") + } + } + + return nil +} diff --git a/pkg/coverage/coverage.go b/pkg/coverage/coverage.go new file mode 100644 index 0000000..0024642 --- /dev/null +++ b/pkg/coverage/coverage.go @@ -0,0 +1,92 @@ +// Package coverage loads a golang coverage report and enrich the entities with the information +package coverage + +import ( + "log/slog" + "path/filepath" + "strings" + + "golang.org/x/tools/cover" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +// LoadCoverage loads the coverage data from the given file +func LoadCoverage(project *entity.Project, coverageReport string) (*entity.Project, error) { + profiles, err := cover.ParseProfiles(coverageReport) + if err != nil { + return nil, err + } + + for _, p := range profiles { + slog.Debug("Profile", "Path", p.FileName, "Blocks", len(p.Blocks)) + found := false + for _, pack := range project.Packages { + if 
!strings.HasPrefix(p.FileName, pack.Name) { + continue + } + found = true + filename := filepath.Base(p.FileName) + for _, f := range pack.Files { + if filepath.Base(f.FilePath) != filename { + continue + } + applyBlocks(f.Methods, p.Blocks) + } + + } + if !found { + slog.Warn("Not found source for: " + p.FileName) + } + } + + updateLineCoverage(project) + updateBranchCoverage(project) + + return project, nil +} + +func applyBlocks(methods []*entity.Method, blocks []cover.ProfileBlock) { + for _, b := range blocks { + if b.Count == 0 { + continue + } + for _, method := range methods { + method.Tree.AddCoverageBlock(b) + } + } +} + +func updateLineCoverage(project *entity.Project) { + for _, pack := range project.Packages { + for _, f := range pack.Files { + for _, method := range f.Methods { + for _, line := range method.GetLines() { + isCovered := line.CoverageCount > 0 + method.LineCoverage.AddLine(isCovered) + f.LineCoverage.AddLine(isCovered) + pack.LineCoverage.AddLine(isCovered) + project.LineCoverage.AddLine(isCovered) + + } + } + } + } +} + +func updateBranchCoverage(project *entity.Project) { + for _, pack := range project.Packages { + for _, f := range pack.Files { + for _, method := range f.Methods { + for _, branch := range method.GetBranches() { + isCovered := branch.Covered + method.BranchCoverage.AddBranch(isCovered) + f.BranchCoverage.AddBranch(isCovered) + pack.BranchCoverage.AddBranch(isCovered) + project.BranchCoverage.AddBranch(isCovered) + + } + } + } + } +} diff --git a/pkg/entity/block.go b/pkg/entity/block.go new file mode 100644 index 0000000..9c10791 --- /dev/null +++ b/pkg/entity/block.go @@ -0,0 +1,118 @@ +package entity + +import ( + "go/token" + + "golang.org/x/tools/cover" +) + +type BlockType uint8 + +const ( + TypeAtomic BlockType = iota + TypeBlock + TypeBranch +) + +type Block struct { + StartPosition token.Position + EndPosition token.Position + DefPosition token.Position + Coverage []cover.ProfileBlock + Type BlockType + Children []*Block + Parent *Block + Ignore bool +} + +func (b *Block) AddCoverageBlock(block cover.ProfileBlock) { + if block.EndLine < b.StartPosition.Line || + block.EndLine == b.StartPosition.Line && block.EndCol <= b.StartPosition.Column || + block.StartLine > b.EndPosition.Line || + block.StartLine == b.EndPosition.Line && block.StartCol >= b.EndPosition.Column { + return + } + if b.Type == TypeAtomic { + if len(b.Coverage) == 0 { + b.Coverage = append(b.Coverage, cover.ProfileBlock{ + StartLine: b.StartPosition.Line, + StartCol: b.StartPosition.Column, + EndLine: b.EndPosition.Line, + EndCol: b.EndPosition.Column, + }) + } + b.Coverage[0].Count = max(b.Coverage[0].Count, block.Count) + } else { + b.Coverage = append(b.Coverage, block) + } + for _, child := range b.Children { + child.AddCoverageBlock(block) + } +} + +func (b *Block) AddBlock(newB *Block) { + if len(b.Children) == 0 { + b.Children = []*Block{newB} + + return + } + + newChilds := make([]*Block, 0, len(b.Children)+1) + added := false + for _, child := range b.Children { + // child is part of new block so ignore it + if child.StartPosition.Offset >= newB.StartPosition.Offset && + child.EndPosition.Offset <= newB.EndPosition.Offset { + continue + } + // new Block is part of child so add it there: + if !added && + child.StartPosition.Offset < newB.StartPosition.Offset && + child.EndPosition.Offset > newB.EndPosition.Offset { + child.AddBlock(newB) + added = true + } + // new Block is not added and before current child + if !added && + child.StartPosition.Offset > 
newB.EndPosition.Offset { + newB.Parent = b + newChilds = append(newChilds, newB) + added = true + } + + newChilds = append(newChilds, child) + } + if !added { + newB.Parent = b + newChilds = append(newChilds, newB) + } + + b.Children = newChilds +} + +func (b *Block) createProfileBlock( + startPos token.Position, + endPos token.Position, + statements int, +) cover.ProfileBlock { + cov := 0 + for _, block := range b.Coverage { + if block.EndLine < startPos.Line || + block.EndLine == startPos.Line && block.EndCol <= startPos.Column || + block.StartLine > endPos.Line || + block.StartLine == endPos.Line && block.StartCol >= endPos.Column { + continue + } + + cov = max(cov, block.Count) + } + + return cover.ProfileBlock{ + StartLine: startPos.Line, + StartCol: startPos.Column, + EndLine: endPos.Line, + EndCol: endPos.Column, + NumStmt: statements, + Count: cov, + } +} diff --git a/pkg/entity/counter.go b/pkg/entity/counter.go new file mode 100644 index 0000000..645ed9d --- /dev/null +++ b/pkg/entity/counter.go @@ -0,0 +1,62 @@ +package entity + +import ( + "fmt" + "strconv" +) + +type LineCounter struct { + totalLines int + coveredLines int +} + +func (c *LineCounter) AddLine(covered bool) { + c.totalLines++ + if covered { + c.coveredLines++ + } +} + +func (c *LineCounter) String() string { + if c.totalLines == 0 { + return "1.00" + } + + return fmt.Sprintf("%.2f", float64(c.coveredLines)/float64(c.totalLines)) +} + +func (c *LineCounter) ValidString() string { + return strconv.Itoa(c.totalLines) +} + +func (c *LineCounter) CoveredString() string { + return strconv.Itoa(c.coveredLines) +} + +type BranchCounter struct { + totalBranches int + coveredBranches int +} + +func (b *BranchCounter) AddBranch(covered bool) { + b.totalBranches++ + if covered { + b.coveredBranches++ + } +} + +func (b *BranchCounter) String() string { + if b.totalBranches == 0 { + return "1.00" + } + + return fmt.Sprintf("%.2f", float64(b.coveredBranches)/float64(b.totalBranches)) +} + +func (b *BranchCounter) ValidString() string { + return strconv.Itoa(b.totalBranches) +} + +func (b *BranchCounter) CoveredString() string { + return strconv.Itoa(b.coveredBranches) +} diff --git a/pkg/entity/coverage.go b/pkg/entity/coverage.go new file mode 100644 index 0000000..8cf3a88 --- /dev/null +++ b/pkg/entity/coverage.go @@ -0,0 +1,174 @@ +package entity + +import ( + "go/ast" + "go/token" + "slices" + + "golang.org/x/tools/cover" +) + +func (m *Method) GetCover() []cover.ProfileBlock { + if m.cover != nil { + return m.cover + } + coverBlocks := m.generateCover() + vis := &posGatherVisitor{ + lines: make(map[int][]token.Pos), + file: m.File, + } + ast.Walk(vis, m.Body) + minLines, maxLines := vis.getLines() + + retBlocks := make([]cover.ProfileBlock, 0, len(coverBlocks)) + for _, block := range coverBlocks { + if block.NumStmt > 0 { + maxLine, ok := maxLines[block.StartLine] + if !ok || maxLine < block.StartCol { + pos := m.File.Position(m.File.LineStart(block.StartLine + 1)) + block.StartLine = pos.Line + block.StartCol = pos.Column + } + minLine, ok := minLines[block.EndLine] + if !ok || minLine > block.EndCol { + pos := m.File.Position(m.File.LineStart(block.EndLine) - 1) + block.EndLine = pos.Line + block.EndCol = pos.Column + } + + retBlocks = append(retBlocks, block) + } + } + + m.cover = retBlocks + + return m.cover +} + +func (m *Method) generateCover() []cover.ProfileBlock { + coverBlocks, _, _, _, _ := m.internalCover(m.Tree) + + return coverBlocks +} + +func (m *Method) internalCover(b *Block) (results 
[]cover.ProfileBlock, cutFrom, cutTo token.Position, statmentsBefore, statmentsAfter int) { + if b.Ignore { + cutFrom = b.DefPosition + cutTo = b.EndPosition + + return + } + switch b.Type { + case TypeBlock: + return m.generateCoverBlock(b) + case TypeBranch: + return m.generateCoverBlock(b) + case TypeAtomic: + return m.generateCoverAtomic(b) + } + + panic("Unknown block type") +} + +func (m *Method) generateCoverBlock(b *Block) (results []cover.ProfileBlock, cutFrom, cutTo token.Position, statmentsBefore, statmentsAfter int) { + cutFrom = b.StartPosition + cutTo = b.EndPosition + results = make([]cover.ProfileBlock, 0, 128) + lastEndPos := m.movePos(b.StartPosition, 1) + for _, child := range b.Children { + childResults, childCutFrom, childCutTo, childStatmentsBefore, childStatmentsAfter := m.internalCover(child) + // valid cut + if childCutFrom.Line > 0 { + if childCutFrom.Offset > lastEndPos.Offset { + results = append( + results, + b.createProfileBlock(lastEndPos, childCutFrom, statmentsAfter+childStatmentsBefore), + ) + } + lastEndPos = childCutTo + statmentsAfter = 0 + } + statmentsAfter += childStatmentsAfter + results = append(results, childResults...) + } + + if lastEndPos.Offset < b.EndPosition.Offset { + results = append( + results, + b.createProfileBlock(lastEndPos, m.movePos(b.EndPosition, -1), statmentsAfter), + ) + statmentsAfter = 0 + } + + return +} + +func (m *Method) generateCoverAtomic(b *Block) (results []cover.ProfileBlock, cutFrom, cutTo token.Position, statmentsBefore, statmentsAfter int) { + if len(b.Children) == 0 { + statmentsAfter = 1 + + return + } + + cutTo = b.EndPosition + results = make([]cover.ProfileBlock, 0, 128) + isFirst := true + for _, child := range b.Children { + childResults, childCutFrom, childCutTo, childStatmentsBefore, childStatmentsAfter := m.internalCover(child) + // valid cut + if childCutFrom.Line > 0 { + if isFirst { + statmentsBefore = statmentsAfter + childStatmentsBefore + isFirst = false + cutFrom = childCutFrom + } else { + results = append( + results, + b.createProfileBlock(cutTo, childCutFrom, statmentsAfter+childStatmentsBefore), + ) + } + + cutTo = childCutTo + statmentsAfter = 0 + } + statmentsAfter += childStatmentsAfter + results = append(results, childResults...) 
+ } + + return +} + +func (m *Method) movePos(pos token.Position, move int) token.Position { + p := m.File.Pos(pos.Offset + move) + + return m.File.Position(p) +} + +type posGatherVisitor struct { + lines map[int][]token.Pos + file *token.File +} + +func (n *posGatherVisitor) Visit(node ast.Node) (w ast.Visitor) { + if node == nil { + return n + } + + lineNUmber := n.file.Position(node.Pos()).Line + n.lines[lineNUmber] = append(n.lines[lineNUmber], node.Pos()) + + return n +} + +func (n *posGatherVisitor) getLines() (minLines, maxLines map[int]int) { + minLines = make(map[int]int, len(n.lines)) + maxLines = make(map[int]int, len(n.lines)) + for lines, pos := range n.lines { + minPos := slices.Min(pos) + maxPos := slices.Max(pos) + minLines[lines] = n.file.Position(minPos).Column + maxLines[lines] = n.file.Position(maxPos).Column + } + + return +} diff --git a/pkg/entity/enitity.go b/pkg/entity/enitity.go new file mode 100644 index 0000000..b2652ca --- /dev/null +++ b/pkg/entity/enitity.go @@ -0,0 +1,113 @@ +// Package entity contains all entities that are used in every component of the application +package entity + +import ( + "go/ast" + "go/token" + + "golang.org/x/tools/cover" +) + +type Project struct { + Packages []*Package + LineCoverage LineCounter + BranchCoverage BranchCounter +} + +type Package struct { + Name string + Files []*File + Fset *token.FileSet + LineCoverage LineCounter + BranchCoverage BranchCounter +} + +type File struct { + Name string + FilePath string + Ast *ast.File + Methods []*Method + LineCoverage LineCounter + BranchCoverage BranchCounter +} + +type Method struct { + Name string + Body *ast.BlockStmt + Tree *Block + LineCoverage LineCounter + BranchCoverage BranchCounter + Complexity int + File *token.File + cover []cover.ProfileBlock + branches []*Branch + lines []*Line +} + +type Line struct { + Number int + CoverageCount int +} + +type Branch struct { + DefLine int + Covered bool +} + +func (m *Method) GetBranches() []*Branch { + if m.branches == nil { + m.branches = m.getBranches(m.Tree) + } + + return m.branches +} + +func (m *Method) getBranches(b *Block) []*Branch { + branches := make([]*Branch, 0, 128) + if b.Type == TypeBranch { + covered := false + if len(b.Coverage) > 0 { + covered = true + } + branches = append(branches, &Branch{ + DefLine: b.DefPosition.Line, + Covered: covered, + }) + } + + for _, child := range b.Children { + branches = append(branches, m.getBranches(child)...) 
+ } + + return branches +} + +func (m *Method) GetLines() []*Line { + if m.lines == nil { + m.lines = m.getLines() + } + + return m.lines +} + +func (m *Method) getLines() []*Line { + lines := make([]*Line, 0, m.Tree.EndPosition.Line-m.Tree.DefPosition.Line+1) + covers := m.GetCover() + dict := make(map[int]*Line, cap(lines)) + for _, c := range covers { + for i := c.StartLine; i <= c.EndLine; i++ { + if existing, ok := dict[i]; !ok { + l := &Line{ + Number: i, + CoverageCount: c.Count, + } + dict[i] = l + lines = append(lines, l) + } else { + existing.CoverageCount = max(existing.CoverageCount, c.Count) + } + } + } + + return lines +} diff --git a/pkg/source/branch.go b/pkg/source/branch.go new file mode 100644 index 0000000..74bbdc3 --- /dev/null +++ b/pkg/source/branch.go @@ -0,0 +1,107 @@ +package source + +import ( + "go/ast" + "go/token" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +type branchVisitor struct { + blocks []*entity.Block + fset *token.FileSet +} + +func (b *branchVisitor) Visit(node ast.Node) (w ast.Visitor) { + switch v := node.(type) { + // block + case *ast.FuncDecl: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Body.Pos(), v.Body.End(), entity.TypeBlock)) + case *ast.FuncLit: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Body.Pos(), entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Body.Pos(), v.Body.End(), entity.TypeBlock)) + case *ast.BlockStmt: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.End(), entity.TypeBlock)) + // branch + case *ast.CaseClause: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Colon, entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Colon, v.End(), entity.TypeBranch)) + case *ast.CommClause: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Colon, entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Colon, v.End(), entity.TypeBranch)) + case *ast.IfStmt: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Body.Pos(), entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Body.Pos(), v.Body.End(), entity.TypeBranch)) + case *ast.ForStmt: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Body.Pos(), entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Body.Pos(), v.Body.End(), entity.TypeBranch)) + case *ast.RangeStmt: + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.Body.Pos(), entity.TypeAtomic)) + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Body.Pos(), v.Body.End(), entity.TypeBranch)) + // atomic + default: + if v == nil { + return b + } + + b.blocks = append(b.blocks, b.createBranch(v.Pos(), v.Pos(), v.End(), entity.TypeAtomic)) + } + + return b +} + +func (b *branchVisitor) createBranch(def, start, end token.Pos, t entity.BlockType) *entity.Block { + return &entity.Block{ + DefPosition: b.fset.Position(def), + StartPosition: b.fset.Position(start), + EndPosition: b.fset.Position(end), + Type: t, + } +} + +func (b *branchVisitor) getBlocks() []*entity.Block { + retBlocks := make([]*entity.Block, 0, len(b.blocks)) + last := &entity.Block{} + for _, block := range b.blocks { + if block.DefPosition.Line == last.DefPosition.Line && block.EndPosition.Line == last.EndPosition.Line { + if last.Type < block.Type { + retBlocks[len(retBlocks)-1] = block + last = block + } + continue + } + retBlocks = append(retBlocks, block) + last = block + } + + return retBlocks +} + +func (b 
*branchVisitor) getTree() *entity.Block { + blocks := b.getBlocks() + rootB := blocks[0] + root := &entity.Block{ + StartPosition: rootB.StartPosition, + EndPosition: rootB.EndPosition, + DefPosition: rootB.DefPosition, + Type: rootB.Type, + } + + last := root + for _, block := range blocks[1:] { + for block.EndPosition.Line > last.EndPosition.Line { + last = last.Parent + } + newBlock := &entity.Block{ + StartPosition: block.StartPosition, + EndPosition: block.EndPosition, + DefPosition: block.DefPosition, + Type: block.Type, + Parent: last, + } + last.Children = append(last.Children, newBlock) + last = newBlock + } + + return root +} diff --git a/pkg/source/source.go b/pkg/source/source.go new file mode 100644 index 0000000..cde86ac --- /dev/null +++ b/pkg/source/source.go @@ -0,0 +1,147 @@ +// Package source reads source files from a given path and create entities from this +package source + +import ( + "go/ast" + "log/slog" + "os" + "path/filepath" + + "golang.org/x/tools/go/packages" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +func LoadSources(path string) (*entity.Project, error) { + goPackages, err := getGoPaths(path) + if err != nil { + return nil, err + } + + cfg := &packages.Config{ + Mode: packages.NeedName | + packages.NeedFiles | + packages.NeedSyntax | + packages.NeedTypesInfo | + packages.NeedModule | + packages.NeedTypes, + Dir: path, + } + + pkgs, err := packages.Load(cfg, goPackages...) + if err != nil { + return nil, err + } + + var countPackages, countFiles, countMethods int + allPackages := make([]*entity.Package, 0, len(pkgs)) + for _, pkg := range pkgs { + pack := &entity.Package{ + Name: pkg.PkgPath, + Files: make([]*entity.File, 0, len(pkg.Syntax)), + Fset: pkg.Fset, + } + + slog.Debug("Package", "Path", pkg.PkgPath, "Files", len(pkg.Syntax)) + for i, fileAst := range pkg.Syntax { + methodsMap := make(map[string][]*entity.Method) + for _, decl := range fileAst.Decls { + if fun, ok := decl.(*ast.FuncDecl); ok { + method := &entity.Method{ + Name: fun.Name.Name, + Body: fun.Body, + File: pkg.Fset.File(fun.Pos()), + } + + // start after the function declaration + startLine := pkg.Fset.Position(fun.Body.Lbrace).Line + 1 + endLine := pkg.Fset.Position(fun.End()).Line + if startLine >= endLine { + continue + } + + bV := &branchVisitor{ + fset: pkg.Fset, + } + + ast.Walk(bV, fun) + + method.Tree = bV.getTree() + + countMethods++ + className := getClassName(fun) + methodsMap[className] = append(methodsMap[className], method) + } + } + + var methodCount int + for className, methods := range methodsMap { + file := &entity.File{ + Name: className, + FilePath: pkg.GoFiles[i], + Ast: fileAst, + Methods: methods, + } + pack.Files = append(pack.Files, file) + + methodCount += len(methods) + } + + slog.Debug("File", "Name", filepath.Base(pkg.GoFiles[i]), "Methods", methodCount) + + countFiles++ + } + + countPackages++ + allPackages = append(allPackages, pack) + } + slog.Info("Source reading Finished", "Packages", countPackages, " Files", countFiles, " Methods", countMethods) + return &entity.Project{Packages: allPackages}, nil +} + +func getGoPaths(path string) ([]string, error) { + goPath := make(map[string]struct{}, 1000) + err := filepath.Walk(path, func(path string, info os.FileInfo, err error) error { + if filepath.Ext(path) == ".go" { + goPath[filepath.Dir(path)] = struct{}{} + } + return nil + }) + if err != nil { + return nil, err + } + + goPackages := make([]string, 0, len(goPath)) + for pack := range goPath { + goPackages = append(goPackages, pack) + } + 
return goPackages, nil +} + +func getClassName(fun *ast.FuncDecl) string { + if fun.Recv == nil { + return "-" + } + + if star, ok := fun.Recv.List[0].Type.(*ast.StarExpr); ok { + if index, ok := star.X.(*ast.IndexExpr); ok { + return index.X.(*ast.Ident).Name + } + + if index, ok := star.X.(*ast.IndexListExpr); ok { + return index.X.(*ast.Ident).Name + } + + return star.X.(*ast.Ident).Name + } + + if index, ok := fun.Recv.List[0].Type.(*ast.IndexExpr); ok { + return index.X.(*ast.Ident).Name + } + + if index, ok := fun.Recv.List[0].Type.(*ast.IndexListExpr); ok { + return index.X.(*ast.Ident).Name + } + + return fun.Recv.List[0].Type.(*ast.Ident).Name +} diff --git a/pkg/writer/condition_coverage.go b/pkg/writer/condition_coverage.go new file mode 100644 index 0000000..8d5b8a7 --- /dev/null +++ b/pkg/writer/condition_coverage.go @@ -0,0 +1,22 @@ +package writer + +import ( + "fmt" +) + +type conditionCoverage struct { + total int + covered int +} + +func (c *conditionCoverage) Add(covered bool) { + c.total++ + + if covered { + c.covered++ + } +} + +func (c *conditionCoverage) String() string { + return fmt.Sprintf("%.2f (%d/%d)", float64(c.covered)/float64(c.total), c.covered, c.total) +} diff --git a/pkg/writer/converter.go b/pkg/writer/converter.go new file mode 100644 index 0000000..fb0fa64 --- /dev/null +++ b/pkg/writer/converter.go @@ -0,0 +1,139 @@ +package writer + +import ( + "strconv" + + "github.com/Fabianexe/gocoverageplus/pkg/entity" +) + +func ConvertToCobertura(path string, project *entity.Project) *Coverage { + pkgs := project.Packages + coverage := &Coverage{ + Sources: &Sources{ + Sources: []*Source{ + { + Path: path, + }, + }, + }, + LineRate: project.LineCoverage.String(), + BranchRate: project.BranchCoverage.String(), + LinesValid: project.LineCoverage.ValidString(), + LinesCovered: project.LineCoverage.CoveredString(), + BranchesValid: project.BranchCoverage.ValidString(), + BranchesCovered: project.BranchCoverage.CoveredString(), + } + + packages := &Packages{ + Packages: make([]*Package, 0, len(pkgs)), + } + totalComplexity := 0 + for _, pkg := range pkgs { + packageCov := &Package{ + Name: pkg.Name, + LineRate: pkg.LineCoverage.String(), + BranchRate: pkg.BranchCoverage.String(), + } + + packageComplexity := 0 + + classes := &Classes{ + Classes: make([]*Class, 0, len(pkg.Files)), + } + + for _, file := range pkg.Files { + class := &Class{ + Name: file.Name, + Filename: file.FilePath, + LineRate: file.LineCoverage.String(), + BranchRate: file.BranchCoverage.String(), + } + + classComplexity := 0 + + methods := &Methods{ + Methods: make([]*Method, 0, len(file.Methods)), + } + + classLines := &Lines{ + Lines: make([]*Line, 0, 1024), + } + + for _, method := range file.Methods { + xmlMethod := &Method{ + Name: method.Name, + LineRate: method.LineCoverage.String(), + BranchRate: method.BranchCoverage.String(), + Complexity: strconv.Itoa(method.Complexity), + } + + totalComplexity += method.Complexity + packageComplexity += method.Complexity + classComplexity += method.Complexity + + branches := method.GetBranches() + conditions := make(map[int]*conditionCoverage, len(branches)) + for _, branch := range branches { + if condition, ok := conditions[branch.DefLine]; ok { + condition.Add(branch.Covered) + } else { + condition := &conditionCoverage{} + condition.Add(branch.Covered) + conditions[branch.DefLine] = condition + } + } + + lines := method.GetLines() + methodsLines := &Lines{ + Lines: make([]*Line, 0, len(lines)), + } + for _, line := range lines { + xmlLine := 
+					xmlLine := &Line{
+						Number: strconv.Itoa(line.Number),
+						Hits:   strconv.Itoa(line.CoverageCount),
+						Branch: "false",
+					}
+					if condition, ok := conditions[line.Number]; ok {
+						xmlLine.Branch = "true"
+						xmlLine.ConditionCoverage = condition.String()
+					}
+
+					methodsLines.Lines = append(methodsLines.Lines, xmlLine)
+					classLines.Lines = append(classLines.Lines, xmlLine)
+				}
+
+				if len(methodsLines.Lines) != 0 {
+					xmlMethod.Lines = methodsLines
+				}
+
+				methods.Methods = append(methods.Methods, xmlMethod)
+			}
+			if len(methods.Methods) != 0 {
+				class.Methods = methods
+			}
+
+			if len(classLines.Lines) != 0 {
+				class.Lines = classLines
+			}
+
+			class.Complexity = strconv.Itoa(classComplexity)
+
+			classes.Classes = append(classes.Classes, class)
+		}
+
+		if len(classes.Classes) != 0 {
+			packageCov.Classes = classes
+		}
+
+		packageCov.Complexity = strconv.Itoa(packageComplexity)
+
+		packages.Packages = append(packages.Packages, packageCov)
+	}
+	if len(packages.Packages) != 0 {
+		coverage.Packages = packages
+	}
+
+	coverage.Complexity = strconv.Itoa(totalComplexity)
+
+	return coverage
+}
diff --git a/pkg/writer/writer.go b/pkg/writer/writer.go
new file mode 100644
index 0000000..0f07d36
--- /dev/null
+++ b/pkg/writer/writer.go
@@ -0,0 +1,79 @@
+// Package writer writes XML files based on the Cobertura DTD:
+// https://github.com/cobertura/cobertura/blob/master/cobertura/src/test/resources/dtds/coverage-04.dtd
+package writer
+
+import (
+	"encoding/xml"
+	"fmt"
+	"log/slog"
+	"os"
+	"path"
+
+	"github.com/Fabianexe/gocoverageplus/pkg/entity"
+)
+
+// WriteXML converts the project to Cobertura XML and writes it to outPath.
+func WriteXML(sourcePath string, project *entity.Project, outPath string) error {
+	xmlCoverage := ConvertToCobertura(sourcePath, project)
+
+	outFile, err := os.Create(outPath)
+	if err != nil {
+		return err
+	}
+
+	encoder := xml.NewEncoder(outFile)
+	encoder.Indent("", "\t")
+
+	slog.Info("Write coverage to file", "Path", outPath)
+	err = encoder.Encode(xmlCoverage)
+	if err != nil {
+		return err
+	}
+
+	if err := outFile.Close(); err != nil {
+		return err
+	}
+
+	return nil
+}
+
+// WriteTextFMT writes the coverage in the text format of Go coverage profiles to outPath.
+func WriteTextFMT(project *entity.Project, outPath string) error {
+	outFile, err := os.Create(outPath)
+	if err != nil {
+		return err
+	}
+
+	slog.Info("Write coverage to file", "Path", outPath)
+	if _, err := outFile.WriteString("mode: set\n"); err != nil {
+		return err
+	}
+	for _, pkg := range project.Packages {
+		for _, file := range pkg.Files {
+			for _, method := range file.Methods {
+				for _, block := range method.GetCover() {
+					if _, err := outFile.WriteString(
+						fmt.Sprintf(
+							"%s/%s:%d.%d,%d.%d %d %d\n",
+							pkg.Name,
+							path.Base(file.FilePath),
+							block.StartLine,
+							block.StartCol,
+							block.EndLine,
+							block.EndCol,
+							block.NumStmt,
+							block.Count,
+						),
+					); err != nil {
+						return err
+					}
+				}
+
+			}
+		}
+	}
+
+	if err := outFile.Close(); err != nil {
+		return err
+	}
+
+	return nil
+}
diff --git a/pkg/writer/xml.go b/pkg/writer/xml.go
new file mode 100644
index 0000000..ef267ac
--- /dev/null
+++ b/pkg/writer/xml.go
@@ -0,0 +1,95 @@
+package writer
+
+import (
+	"encoding/xml"
+)
+
+type Coverage struct {
+	XMLName xml.Name `xml:"coverage"`
+
+	Sources  *Sources  `xml:"sources"`
+	Packages *Packages `xml:"packages"`
+
+	LineRate        string `xml:"line-rate,attr"`
+	BranchRate      string `xml:"branch-rate,attr"`
+	LinesCovered    string `xml:"lines-covered,attr"`
+	LinesValid      string `xml:"lines-valid,attr"`
+	BranchesCovered string `xml:"branches-covered,attr"`
+	BranchesValid   string `xml:"branches-valid,attr"`
+	Complexity      string `xml:"complexity,attr"`
+	Version         string `xml:"version,attr"`
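+	// Version and Timestamp are part of the DTD; ConvertToCobertura currently leaves them empty.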
+	Timestamp       string `xml:"timestamp,attr"`
+}
+
+type Sources struct {
+	Sources []*Source `xml:"source"`
+}
+
+type Source struct {
+	Path string `xml:",chardata"`
+}
+
+type Packages struct {
+	Packages []*Package `xml:"package"`
+}
+
+type Package struct {
+	Classes *Classes `xml:"classes"`
+
+	Name       string `xml:"name,attr"`
+	LineRate   string `xml:"line-rate,attr"`
+	BranchRate string `xml:"branch-rate,attr"`
+	Complexity string `xml:"complexity,attr"`
+}
+
+type Classes struct {
+	Classes []*Class `xml:"class"`
+}
+
+type Class struct {
+	Methods *Methods `xml:"methods"`
+	Lines   *Lines   `xml:"lines"`
+
+	Name       string `xml:"name,attr"`
+	Filename   string `xml:"filename,attr"`
+	LineRate   string `xml:"line-rate,attr"`
+	BranchRate string `xml:"branch-rate,attr"`
+	Complexity string `xml:"complexity,attr"`
+}
+
+type Methods struct {
+	Methods []*Method `xml:"method"`
+}
+
+type Method struct {
+	Lines *Lines `xml:"lines"`
+
+	Name       string `xml:"name,attr"`
+	Signature  string `xml:"signature,attr"`
+	LineRate   string `xml:"line-rate,attr"`
+	BranchRate string `xml:"branch-rate,attr"`
+	Complexity string `xml:"complexity,attr"`
+}
+
+type Lines struct {
+	Lines []*Line `xml:"line"`
+}
+
+type Line struct {
+	Conditions *Conditions `xml:"conditions"`
+
+	Number            string `xml:"number,attr"`
+	Hits              string `xml:"hits,attr"`
+	Branch            string `xml:"branch,attr"`
+	ConditionCoverage string `xml:"condition-coverage,attr"`
+}
+
+type Conditions struct {
+	Conditions []*Condition `xml:"condition"`
+}
+
+type Condition struct {
+	Number   string `xml:"number,attr"`
+	Type     string `xml:"type,attr"`
+	Coverage string `xml:"coverage,attr"`
+}
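
As a rough usage sketch (not part of this change set): the hypothetical wiring below shows how the packages added above could fit together. The actual cmd integration and the step that merges coverage data into the project are not contained in this diff and may differ.

	package main

	import (
		"log"

		"github.com/Fabianexe/gocoverageplus/pkg/source"
		"github.com/Fabianexe/gocoverageplus/pkg/writer"
	)

	func main() {
		// build the project entities from the sources in the current directory
		project, err := source.LoadSources(".")
		if err != nil {
			log.Fatal(err)
		}

		// ...coverage data would be merged into project here (not shown in this diff)...

		// emit the Cobertura report
		if err := writer.WriteXML(".", project, "coverage.xml"); err != nil {
			log.Fatal(err)
		}
	}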