Merge pull request #3 from ambarltd/ci_acceptance_testing
Updates to allow partial state capture in the event of an interrupt; updated the DataSource config map field name in the examples.
tjschutte authored Feb 2, 2024
2 parents 0dab832 + f241e05 commit 7f419b7
Showing 15 changed files with 79 additions and 100 deletions.
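
The substance of this commit is a reordering in the Create and Update methods of both resource implementations (`internal/provider/data_destination_resource.go` and `internal/provider/data_source_resource.go`): the resource identifier and last reported resource state are now written into Terraform state as soon as the Ambar API acknowledges the request, and only afterwards does the provider poll for provisioning to finish. Previously state was only set after polling completed, so an interrupted apply could leave a freshly created Ambar resource untracked. A minimal sketch of the pattern, assuming terraform-plugin-framework v1 (`types.StringValue`, `resp.State.Set`) as used in the hunks below; the helper name, model struct, and `tfsdk` tags here are illustrative, not copied from the repository:

```go
package provider

import (
	"context"

	"github.com/hashicorp/terraform-plugin-framework/resource"
	"github.com/hashicorp/terraform-plugin-framework/types"
)

// Illustrative model: field names and tags are assumptions for this sketch.
type dataSourceModelSketch struct {
	ResourceId types.String `tfsdk:"resource_id"`
	State      types.String `tfsdk:"state"`
	// ...remaining attributes omitted...
}

// savePartialState records the resource ID and its last reported state as soon
// as the Ambar API acknowledges a create, before any polling begins. If the
// apply is interrupted while waiting for the resource to settle, Terraform
// still knows the resource exists and can refresh or destroy it on a later run.
func savePartialState(ctx context.Context, resp *resource.CreateResponse, plan *dataSourceModelSketch, resourceId, resourceState string) {
	plan.ResourceId = types.StringValue(resourceId)
	plan.State = types.StringValue(resourceState)
	resp.Diagnostics.Append(resp.State.Set(ctx, plan)...)
}
```

After the poll loop completes, the same `resp.State.Set` call runs again with the fully populated data, as shown in the hunks below.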
3 changes: 2 additions & 1 deletion .github/workflows/test.yml
@@ -53,7 +53,8 @@ jobs:
name: Terraform Provider Acceptance Tests
needs: build
runs-on: ubuntu-latest
timeout-minutes: 15
environment: Acceptance Testing
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
5 changes: 4 additions & 1 deletion CHANGELOG.md
@@ -1,4 +1,7 @@
## 1.0.0 (Initial Release)

FEATURES:
* Ambar initial Terraform support.
* Support for Ambar DataSource resources, such as the Postgres DataSourceType.
* Support for Ambar Filter resources, allowing you to define a record sequence filter to be applied to a DataSource.
* Support for Ambar DataDestination resources, allowing delivery of one or more filtered record sequences from one or more DataDestinations.
81 changes: 17 additions & 64 deletions README.md
@@ -1,64 +1,17 @@
# Terraform Provider Scaffolding (Terraform Plugin Framework)

_This template repository is built on the [Terraform Plugin Framework](https://github.com/hashicorp/terraform-plugin-framework). The template repository built on the [Terraform Plugin SDK](https://github.com/hashicorp/terraform-plugin-sdk) can be found at [terraform-provider-scaffolding](https://github.com/hashicorp/terraform-provider-scaffolding). See [Which SDK Should I Use?](https://developer.hashicorp.com/terraform/plugin/framework-benefits) in the Terraform documentation for additional information._

This repository is a *template* for a [Terraform](https://www.terraform.io) provider. It is intended as a starting point for creating Terraform providers, containing:

- A resource and a data source (`internal/provider/`),
- Examples (`examples/`) and generated documentation (`docs/`),
- Miscellaneous meta files.

These files contain boilerplate code that you will need to edit to create your own Terraform provider. Tutorials for creating Terraform providers can be found on the [HashiCorp Developer](https://developer.hashicorp.com/terraform/tutorials/providers-plugin-framework) platform. _Terraform Plugin Framework specific guides are titled accordingly._

Please see the [GitHub template repository documentation](https://help.github.com/en/github/creating-cloning-and-archiving-repositories/creating-a-repository-from-a-template) for how to create a new repository from this template on GitHub.

Once you've written your provider, you'll want to [publish it on the Terraform Registry](https://developer.hashicorp.com/terraform/registry/providers/publishing) so that others can use it.

## Requirements

- [Terraform](https://developer.hashicorp.com/terraform/downloads) >= 1.0
- [Go](https://golang.org/doc/install) >= 1.20

## Building The Provider

1. Clone the repository
1. Enter the repository directory
1. Build the provider using the Go `install` command:

```shell
go install
```

## Adding Dependencies

This provider uses [Go modules](https://github.com/golang/go/wiki/Modules).
Please see the Go documentation for the most up to date information about using Go modules.

To add a new dependency `github.com/author/dependency` to your Terraform provider:

```shell
go get github.com/author/dependency
go mod tidy
```

Then commit the changes to `go.mod` and `go.sum`.

## Using the provider

Fill this in for each provider

## Developing the Provider

If you wish to work on the provider, you'll first need [Go](http://www.golang.org) installed on your machine (see [Requirements](#requirements) above).

To compile the provider, run `go install`. This will build the provider and put the provider binary in the `$GOPATH/bin` directory.

To generate or update documentation, run `go generate`.

In order to run the full suite of Acceptance tests, run `make testacc`.

*Note:* Acceptance tests create real resources, and often cost money to run.

```shell
make testacc
```
<!-- markdownlint-disable first-line-h1 no-inline-html -->
<a href="https://terraform.io">
<picture>
<source media="(prefers-color-scheme: dark)" srcset=".github/terraform_logo_dark.svg">
<source media="(prefers-color-scheme: light)" srcset=".github/terraform_logo_light.svg">
<img src=".github/terraform_logo_light.svg" alt="Terraform logo" title="Terraform" align="right" height="50">
</picture>
</a>

# Terraform Ambar Provider

The [Ambar Provider](https://registry.terraform.io/providers/ambarltd/ambar/latest/docs) allows [Terraform](https://terraform.io) to manage [Ambar](https://ambar.cloud) resources.

- [Contributing guide] *coming soon*
- [FAQ] *coming soon*
- [Tutorials and Examples] *coming soon*
- [Help and Support] *coming soon*
2 changes: 1 addition & 1 deletion docs/index.md
@@ -16,7 +16,7 @@ Interact with your regional Ambar environment.
terraform {
required_providers {
ambar = {
source = "ambar.cloud/terraform/ambar"
source = "ambarltd/ambar"
}
}
}
2 changes: 2 additions & 0 deletions docs/resources/data_destination.md
@@ -51,5 +51,7 @@ Import is supported using the following syntax:

```shell
# Ambar DataDestinations can be imported by specifying the resource identifier.
# Note: Some sensitive fields like usernames and passwords will not get imported into Terraform state
# from existing resources and may require further action to manage via Terraform templates.
terraform import ambar_data_destination.example_data_destination AMBAR-1234567890
```
8 changes: 7 additions & 1 deletion docs/resources/data_source.md
@@ -20,13 +20,17 @@ resource "ambar_data_source" "example_data_source" {
serial_column = "serial"
username = "username"
password = "password"
# data_source_config key-values depend on the type of DataSource being created.
# See Ambar docs for more details.
data_source_config = {
"hostname" : "host",
"hostPort" : "5432",
"databaseName" : "postgres",
"tableName" : "events",
"publicationName" : "example_pub",
"additionalColumns" : "some,other,column"
# columns should include all columns to be read from the database
# including the partition and serial columns
"columns" : "partition,serial,some,other,column"
}
}
```
@@ -58,5 +62,7 @@ Import is supported using the following syntax:

```shell
# Ambar DataSources can be imported by specifying the resource identifier.
# Note: Some sensitive fields like usernames and passwords will not get imported into Terraform state
# from existing resources and may require further action to manage via Terraform templates.
terraform import ambar_data_source.example_data_source AMBAR-1234567890
```
2 changes: 1 addition & 1 deletion examples/provider/provider.tf
@@ -1,7 +1,7 @@
terraform {
required_providers {
ambar = {
source = "ambar.cloud/terraform/ambar"
source = "ambarltd/ambar"
}
}
}
2 changes: 2 additions & 0 deletions examples/resources/ambar_data_destination/import.sh
@@ -1,2 +1,4 @@
# Ambar DataDestinations can be imported by specifying the resource identifier.
# Note: Some sensitive fields like usernames and passwords will not get imported into Terraform state
# from existing resources and may require further action to manage via Terraform templates.
terraform import ambar_data_destination.example_data_destination AMBAR-1234567890
2 changes: 2 additions & 0 deletions examples/resources/ambar_data_source/import.sh
@@ -1,2 +1,4 @@
# Ambar DataSources can be imported by specifying the resource identifier.
# Note: Some sensitive fields like usernames and passwords will not get imported into Terraform state
# from existing resources and may require further action to manage via Terraform templates.
terraform import ambar_data_source.example_data_source AMBAR-1234567890
6 changes: 5 additions & 1 deletion examples/resources/ambar_data_source/resource.tf
@@ -5,12 +5,16 @@ resource "ambar_data_source" "example_data_source" {
serial_column = "serial"
username = "username"
password = "password"
# data_source_config key-values depend on the type of DataSource being created.
# See Ambar docs for more details.
data_source_config = {
"hostname" : "host",
"hostPort" : "5432",
"databaseName" : "postgres",
"tableName" : "events",
"publicationName" : "example_pub",
"additionalColumns" : "some,other,column"
# columns should include all columns to be read from the database
# including the partition and serial columns
"columns" : "partition,serial,some,other,column"
}
}
13 changes: 10 additions & 3 deletions internal/provider/data_destination_resource.go
@@ -168,6 +168,12 @@ func (r *DataDestinationResource) Create(ctx context.Context, req resource.Creat
return
}

plan.ResourceId = types.StringValue(createResourceResponse.ResourceId)
plan.State = types.StringValue(createResourceResponse.ResourceState)

diags = resp.State.Set(ctx, plan)
resp.Diagnostics.Append(diags...)

var describeDataDestination Ambar.DescribeResourceRequest
describeDataDestination.ResourceId = createResourceResponse.ResourceId

@@ -193,9 +199,6 @@ func (r *DataDestinationResource) Create(ctx context.Context, req resource.Creat
// Set state to fully populated data
diags = resp.State.Set(ctx, plan)
resp.Diagnostics.Append(diags...)
if resp.Diagnostics.HasError() {
return
}
}

func (r *DataDestinationResource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) {
@@ -257,6 +260,10 @@ func (r *DataDestinationResource) Update(ctx context.Context, req resource.Updat
return
}

// partial state save in case of interrupt
data.State = types.StringValue(updateResourceResponse.ResourceState)
resp.Diagnostics.Append(resp.State.Set(ctx, &data)...)

// Wait for the update to complete
var describeResourceResponse *Ambar.DataDestination
var describeDataDestination Ambar.DescribeResourceRequest
16 changes: 12 additions & 4 deletions internal/provider/data_source_resource.go
@@ -180,6 +180,14 @@ func (r *dataSourceResource) Create(ctx context.Context, req resource.CreateRequ
return
}

// Map response body to schema and populate Computed attribute values
plan.ResourceId = types.StringValue(createResourceResponse.ResourceId)
plan.State = types.StringValue(createResourceResponse.ResourceState)

// Set state to fully populated data
diags = resp.State.Set(ctx, plan)
resp.Diagnostics.Append(diags...)

var describeDataSource Ambar.DescribeResourceRequest
describeDataSource.ResourceId = createResourceResponse.ResourceId

@@ -199,15 +207,11 @@ func (r *dataSourceResource) Create(ctx context.Context, req resource.CreateRequ
}

// Map response body to schema and populate Computed attribute values
plan.ResourceId = types.StringValue(createResourceResponse.ResourceId)
plan.State = types.StringValue(describeResourceResponse.State)

// Set state to fully populated data
diags = resp.State.Set(ctx, plan)
resp.Diagnostics.Append(diags...)
if resp.Diagnostics.HasError() {
return
}
}

func (r *dataSourceResource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) {
@@ -274,6 +278,10 @@ func (r *dataSourceResource) Update(ctx context.Context, req resource.UpdateRequ
return
}

// partial state save in case of interrupt
data.State = types.StringValue(updateResourceResponse.ResourceState)
resp.Diagnostics.Append(resp.State.Set(ctx, &data)...)

// Wait for the update to complete
var describeDataSource Ambar.DescribeResourceRequest
describeDataSource.ResourceId = data.ResourceId.ValueString()
16 changes: 8 additions & 8 deletions internal/provider/data_source_resource_test.go
@@ -9,18 +9,18 @@ const (
exampleDataSourceConfig = `
resource "ambar_data_source" "test_data_source" {
data_source_type = "postgres"
description = "My Terraform DataSource"
partitioning_column = "partition"
serial_column = "serial"
username = "postgres"
password = "password"
description = "My Terraform Acceptance Test DataSource"
# partitioning_column = "partition"
# serial_column = "serial"
# username = "postgres"
# password = "password"
data_source_config = {
"hostname": "host",
# "hostname": "host",
"hostPort": "5432",
"databaseName": "postgres",
"tableName": "events",
"publicationName": "example_pub",
"additionalColumns": "seqid,seqnum,value"
"publicationName": "acceptance_test_pub",
# "additionalColumns": "seqid,seqnum,value"
}
}`
)
16 changes: 4 additions & 12 deletions internal/provider/provider_test.go
@@ -4,21 +4,19 @@
package provider

import (
"testing"

"github.com/hashicorp/terraform-plugin-framework/providerserver"
"github.com/hashicorp/terraform-plugin-go/tfprotov6"
)

const (
// providerConfig is a shared configuration to combine with the actual
// test configuration so the HashiCups client is properly configured.
// It is also possible to use the HASHICUPS_ environment variables instead,
// test configuration so the Ambar client is properly configured.
// It is also possible to use the AMBAR_ environment variables instead,
// such as updating the Makefile and running the testing through that tool.
providerConfig = `
provider "ambar" {
endpoint = "region.api.ambar.cloud"
api_key = "your-key"
# endpoint = "region.api.ambar.cloud"
# api_key = "your-key"
}
`
)
@@ -30,9 +28,3 @@ const (
var testAccProtoV6ProviderFactories = map[string]func() (tfprotov6.ProviderServer, error){
"ambar": providerserver.NewProtocol6WithError(New("test")()),
}

func testAccPreCheck(t *testing.T) {
// You can add code here to run prior to any test case execution, for example assertions
// about the appropriate environment variables being set are common to see in a pre-check
// function.
}
5 changes: 2 additions & 3 deletions main.go
@@ -8,7 +8,7 @@ import (
"flag"
"log"

// Provider framework backend server, allows us to connect to HashiCorp and do local testing
// Provider framework backend server, allows us to connect to HashiCorp and do local testing.
"github.com/hashicorp/terraform-plugin-framework/providerserver"

"terraform-provider-ambar/internal/provider"
@@ -41,8 +41,7 @@ func main() {

opts := providerserver.ServeOpts{
Address: "registry.terraform.io/hashicorp/ambar",
// Stealing this value from the tutorial for a moment. Todo: replace this with the above.
Debug: debug,
Debug: debug,
}

err := providerserver.Serve(context.Background(), provider.New(version), opts)
