Wednesday

18-06-2025 Vol 19

Clean, Performant, and Testable: Mastering Data Access in Go with Repositories & sqlc (Part 2)

Introduction

In Part 1 of this series, we laid the groundwork for building a robust and maintainable data access layer in Go using the Repository pattern and sqlc. We explored the benefits of this approach, including improved code organization, testability, and performance. We also set up our development environment, defined our database schema, and generated Go code from our SQL queries using sqlc.

In this second part, we’ll dive deeper into the practical implementation of the Repository pattern. We’ll create concrete repository implementations, handle database transactions, and write comprehensive unit tests to ensure our data access layer functions correctly. We’ll also discuss common challenges and best practices for working with repositories and sqlc in Go.

This post is intended for Go developers who have a basic understanding of SQL and are looking to improve the quality and maintainability of their data access code. Familiarity with the concepts introduced in Part 1 is highly recommended.

Table of Contents

  1. Building Concrete Repositories
  2. Dependency Injection for Testability
  3. Handling Database Transactions
  4. Robust Error Handling
  5. Writing Unit Tests
  6. Implementing Pagination
  7. Common Challenges and Solutions
  8. Performance Optimization Strategies
  9. Conclusion

1. Building Concrete Repositories

Now that we’ve defined our repository interfaces, it’s time to create concrete implementations that interact with our database. These implementations will use the code generated by sqlc to execute SQL queries and map the results to Go data structures.

Let’s assume we have a simple users table with the following schema:


  CREATE TABLE users (
      id SERIAL PRIMARY KEY,
      name VARCHAR(255) NOT NULL,
      email VARCHAR(255) UNIQUE NOT NULL,
      created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW()
  );
  

And the sqlc-generated code includes a User struct along with functions for querying and manipulating the users table.

We’ll start by creating a file named user_repository.go. This file will contain the concrete implementation of our UserRepository interface.


  package repository

  import (
      "context"
      "database/sql"
      "errors"
      "fmt"

      "example.com/your-project/db/sqlc" // Replace with your actual sqlc package path
      "example.com/your-project/domain"  // Replace with your actual domain package path
  )

  type UserRepository interface {
      CreateUser(ctx context.Context, arg domain.CreateUserParams) (domain.User, error)
      GetUser(ctx context.Context, id int32) (domain.User, error)
      ListUsers(ctx context.Context, arg domain.ListUsersParams) ([]domain.User, error)
      UpdateUser(ctx context.Context, arg domain.UpdateUserParams) (domain.User, error)
      DeleteUser(ctx context.Context, id int32) error
  }

  type userRepository struct {
      db *sql.DB
      queries *sqlc.Queries
  }

  func NewUserRepository(db *sql.DB) UserRepository {
      return &userRepository{
          db: db,
          queries: sqlc.New(db),
      }
  }

  func (r *userRepository) CreateUser(ctx context.Context, arg domain.CreateUserParams) (domain.User, error) {
      // The CreateUser query is defined as `:one` with a RETURNING clause,
      // so sqlc returns the inserted row directly. (lib/pq does not support
      // LastInsertId(), so RETURNING is the idiomatic approach for PostgreSQL.)
      dbUser, err := r.queries.CreateUser(ctx, sqlc.CreateUserParams{
          Name:  arg.Name,
          Email: arg.Email,
      })
      if err != nil {
          return domain.User{}, fmt.Errorf("failed to create user: %w", err)
      }

      return domain.User{
          ID:        dbUser.ID,
          Name:      dbUser.Name,
          Email:     dbUser.Email,
          CreatedAt: dbUser.CreatedAt,
      }, nil
  }

  func (r *userRepository) GetUser(ctx context.Context, id int32) (domain.User, error) {
      user, err := r.queries.GetUser(ctx, id)
      if err != nil {
          if errors.Is(err, sql.ErrNoRows) {
              return domain.User{}, domain.ErrNotFound
          }
          return domain.User{}, fmt.Errorf("failed to get user: %w", err)
      }

      return domain.User{
          ID:        user.ID,
          Name:      user.Name,
          Email:     user.Email,
          CreatedAt: user.CreatedAt,
      }, nil
  }

  func (r *userRepository) ListUsers(ctx context.Context, arg domain.ListUsersParams) ([]domain.User, error) {
      users, err := r.queries.ListUsers(ctx, sqlc.ListUsersParams{
          Limit:  arg.Limit,
          Offset: arg.Offset,
      })
      if err != nil {
          return nil, fmt.Errorf("failed to list users: %w", err)
      }

      domainUsers := make([]domain.User, len(users))
      for i, user := range users {
          domainUsers[i] = domain.User{
              ID:        user.ID,
              Name:      user.Name,
              Email:     user.Email,
              CreatedAt: user.CreatedAt,
          }
      }

      return domainUsers, nil
  }

  func (r *userRepository) UpdateUser(ctx context.Context, arg domain.UpdateUserParams) (domain.User, error) {
      err := r.queries.UpdateUser(ctx, sqlc.UpdateUserParams{
          ID:    arg.ID,
          Name:  sql.NullString{String: arg.Name, Valid: arg.Name != ""},
          Email: sql.NullString{String: arg.Email, Valid: arg.Email != ""},
      })
      if err != nil {
          return domain.User{}, fmt.Errorf("failed to update user: %w", err)
      }

      dbUser, err := r.queries.GetUser(ctx, arg.ID)
      if err != nil {
          return domain.User{}, fmt.Errorf("failed to get updated user: %w", err)
      }

      return domain.User{
          ID:        dbUser.ID,
          Name:      dbUser.Name,
          Email:     dbUser.Email,
          CreatedAt: dbUser.CreatedAt,
      }, nil
  }

  func (r *userRepository) DeleteUser(ctx context.Context, id int32) error {
      err := r.queries.DeleteUser(ctx, id)
      if err != nil {
          return fmt.Errorf("failed to delete user: %w", err)
      }
      return nil
  }

  

Key points to note:

  • The userRepository struct holds a *sql.DB connection and a *sqlc.Queries instance. This is how we interact with the database.
  • The NewUserRepository function is a constructor that creates a new instance of userRepository. It takes a *sql.DB connection as an argument.
  • Each method in the userRepository implements the corresponding method in the UserRepository interface.
  • We use the queries object (generated by sqlc) to execute the SQL queries.
  • We handle potential errors from the database and return them to the caller. We also define a custom error domain.ErrNotFound to indicate that a user was not found.
  • We map the data returned by sqlc (sqlc.User) to our domain model (domain.User). This helps to decouple our data access layer from the specific database implementation.
  • For optional fields like `Name` and `Email` in the `UpdateUser` function, we use `sql.NullString` to handle cases where the values are not provided.

Remember to replace "example.com/your-project/db/sqlc" and "example.com/your-project/domain" with the actual paths to your sqlc and domain packages.

2. Dependency Injection for Testability

Dependency Injection (DI) is a design pattern that allows us to decouple our code from its dependencies. This makes our code more testable, maintainable, and reusable. In the context of our repository pattern, DI allows us to easily swap out the real database connection with a mock or stub database connection during testing.

We already implemented DI in the previous section by injecting the *sql.DB connection into the NewUserRepository constructor. This allows us to pass in a different database connection during testing.

Here’s how we can use DI in our application:


  package main

  import (
      "context"
      "database/sql"
      "log"

      _ "github.com/lib/pq" // PostgreSQL driver

      "example.com/your-project/domain" // Replace with your actual domain package path
      "example.com/your-project/repository" // Replace with your actual repository package path
      "example.com/your-project/service" // Replace with your actual service package path
  )

  func main() {
      // Connect to the database
      db, err := sql.Open("postgres", "postgres://user:password@host:port/database?sslmode=disable")
      if err != nil {
          log.Fatal(err)
      }
      defer db.Close()

      // Create a new user repository
      userRepo := repository.NewUserRepository(db)

      // Create a new user service, injecting the repository
      userService := service.NewUserService(userRepo)

      // Use the user service to create a new user
      _, err = userService.CreateUser(context.Background(), domain.CreateUserParams{
          Name:  "John Doe",
          Email: "john.doe@example.com",
      })
      if err != nil {
          log.Fatal(err)
      }

      log.Println("User created successfully!")
  }
  

In this example, we create a new UserService and inject the UserRepository into it. This allows the UserService to interact with the database without knowing the specific implementation details of the UserRepository.

3. Handling Database Transactions

Transactions are essential for maintaining data consistency and integrity. A transaction is a sequence of operations that are treated as a single unit of work. If any operation in the transaction fails, the entire transaction is rolled back, ensuring that the database remains in a consistent state.

To handle transactions in Go with sqlc, we use the sql.Tx type. We start a transaction with the db.BeginTx() method, create a new sqlc.Queries instance bound to the transaction object, and finally commit or roll back using tx.Commit() or tx.Rollback().

Here’s an example of how to handle transactions in our UserRepository:


  func (r *userRepository) CreateUserWithTransaction(ctx context.Context, arg domain.CreateUserParams) (domain.User, error) {
      tx, err := r.db.BeginTx(ctx, nil)
      if err != nil {
          return domain.User{}, fmt.Errorf("failed to begin transaction: %w", err)
      }
      defer tx.Rollback() // No-op once the transaction has been committed

      queries := sqlc.New(tx)

      // As in CreateUser, the query uses a RETURNING clause, so the
      // inserted row comes back directly.
      dbUser, err := queries.CreateUser(ctx, sqlc.CreateUserParams{
          Name:  arg.Name,
          Email: arg.Email,
      })
      if err != nil {
          return domain.User{}, fmt.Errorf("failed to create user: %w", err)
      }

      // ... Perform other database operations within the transaction ...

      if err := tx.Commit(); err != nil {
          return domain.User{}, fmt.Errorf("failed to commit transaction: %w", err)
      }

      return domain.User{
          ID:        dbUser.ID,
          Name:      dbUser.Name,
          Email:     dbUser.Email,
          CreatedAt: dbUser.CreatedAt,
      }, nil
  }
  

Important considerations for transaction handling:

  • Always defer the tx.Rollback() call immediately after starting the transaction. This guarantees a rollback on any early return; once tx.Commit() has succeeded, the deferred Rollback() becomes a harmless no-op (it returns sql.ErrTxDone).
  • Create a new sqlc.Queries instance for each transaction. This ensures that the queries are executed within the context of the transaction.
  • Handle errors carefully and rollback the transaction if any error occurs.
  • Commit the transaction only after all operations have completed successfully.
  • Consider using the `context` to manage the transaction’s lifecycle. You can pass a context with a timeout or cancellation signal to the transaction.

4. Robust Error Handling

Effective error handling is crucial for building reliable and maintainable applications. In our data access layer, we need to handle errors from the database and return them to the caller in a meaningful way. We also need to define custom error types to represent specific error conditions.

Here are some best practices for error handling in our repository:

  • Wrap errors: Use fmt.Errorf with the %w verb (e.g., fmt.Errorf("failed to get user: %w", err)) to wrap errors from the database. This preserves the original error while adding context, which makes failures easier to debug and their root cause easier to trace.
  • Define custom errors: Define custom error types to represent specific error conditions, such as ErrNotFound. This allows the caller to easily check for specific error conditions and handle them accordingly.
  • Use sentinel errors or error types: Choose between sentinel errors (e.g., `var ErrNotFound = errors.New("not found")`) or custom error types based on the complexity and information you need to convey with the error. Error types allow you to attach more data to the error, which can be useful for logging and debugging.
  • Log errors: Log errors with sufficient detail to help with debugging. Include the error message, the stack trace, and any relevant context information.
  • Avoid panics: Avoid using panic for error handling. Panics should be reserved for exceptional situations that are unrecoverable.
  • Prefer the standard library for wrapping: Since Go 1.13, fmt.Errorf with %w together with errors.Is and errors.As covers most error wrapping and inspection needs; third-party libraries such as github.com/pkg/errors are now in maintenance mode and rarely necessary.

Here’s an example of how to define a custom error type:


  package domain

  import "errors"

  var ErrNotFound = errors.New("not found")
  

And here’s how to use it in our GetUser method:


  func (r *userRepository) GetUser(ctx context.Context, id int32) (domain.User, error) {
      user, err := r.queries.GetUser(ctx, id)
      if err != nil {
          if errors.Is(err, sql.ErrNoRows) {
              return domain.User{}, domain.ErrNotFound
          }
          return domain.User{}, fmt.Errorf("failed to get user: %w", err)
      }

      return domain.User{
          ID:        user.ID,
          Name:      user.Name,
          Email:     user.Email,
          CreatedAt: user.CreatedAt,
      }, nil
  }
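On the calling side, for example in an HTTP handler, errors.Is recognizes the sentinel even after the repository has wrapped it with fmt.Errorf and %w. A minimal, self-contained sketch (findUser and handle are illustrative stand-ins, not the article's code):

```go
package main

import (
	"errors"
	"fmt"
)

// ErrNotFound mirrors the sentinel error defined in the domain package.
var ErrNotFound = errors.New("not found")

// findUser stands in for a repository call that wraps the sentinel error.
func findUser(id int32) error {
	if id > 100 {
		return fmt.Errorf("failed to get user %d: %w", id, ErrNotFound)
	}
	return nil
}

// handle maps repository errors to HTTP-style status codes.
func handle(id int32) string {
	err := findUser(id)
	switch {
	case errors.Is(err, ErrNotFound):
		return "404"
	case err != nil:
		return "500"
	default:
		return "200"
	}
}
```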
  

5. Writing Unit Tests

Unit tests are essential for ensuring that our code functions correctly and for preventing regressions. In our data access layer, we need to write unit tests to verify that our repositories are interacting with the database correctly.

Here are some best practices for writing unit tests for our repositories:

  • Use a testing framework: Use Go's built-in testing package to write and run unit tests.
  • Mock dependencies: Mock out any dependencies that are not part of the unit being tested. In our case, we need to mock out the database connection to avoid interacting with a real database during testing.
  • Use testify: Leverage assertion libraries like `testify` for more readable and powerful assertions.
  • Test happy paths and error paths: Test both the happy paths (when everything goes right) and the error paths (when something goes wrong).
  • Write clear and concise tests: Write tests that are easy to understand and maintain. Each test should focus on testing a single aspect of the code.
  • Follow the Arrange-Act-Assert pattern: Structure your tests using the Arrange-Act-Assert pattern to improve readability.
  • Use table-driven tests: For testing functions with multiple inputs and outputs, use table-driven tests to avoid code duplication.
  • Clean up after tests: Clean up any resources that were created during the test, such as temporary database records.

Here’s an example of how to write a unit test for our GetUser method:


  package repository_test

  import (
      "context"
      "database/sql"
      "errors"
      "regexp"
      "testing"
      "time"

      "github.com/DATA-DOG/go-sqlmock"
      "github.com/stretchr/testify/assert"
      "github.com/stretchr/testify/require"

      "example.com/your-project/domain" // Replace with your actual domain package path
      "example.com/your-project/repository" // Replace with your actual repository package path
  )

  func TestGetUser(t *testing.T) {
      db, mock, err := sqlmock.New()
      require.NoError(t, err)
      defer db.Close()

      repo := repository.NewUserRepository(db)

      testCases := []struct {
          name          string
          userID        int32
          mockBehavior  func(sqlmock.Sqlmock, int32)
          expectedUser  domain.User
          expectedError error
      }{
          {
              name:   "Success",
              userID: 1,
              mockBehavior: func(mock sqlmock.Sqlmock, userID int32) {
                  rows := sqlmock.NewRows([]string{"id", "name", "email", "created_at"}).
                      AddRow(userID, "John Doe", "john.doe@example.com", time.Now())

                  mock.ExpectQuery(regexp.QuoteMeta("SELECT id, name, email, created_at FROM users WHERE id = $1")).
                      WithArgs(userID).
                      WillReturnRows(rows)
              },
              expectedUser: domain.User{
                  ID:    1,
                  Name:  "John Doe",
                  Email: "john.doe@example.com",
              },
              expectedError: nil,
          },
          {
              name:   "NotFound",
              userID: 2,
              mockBehavior: func(mock sqlmock.Sqlmock, userID int32) {
                  mock.ExpectQuery(regexp.QuoteMeta("SELECT id, name, email, created_at FROM users WHERE id = $1")).
                      WithArgs(userID).
                      WillReturnError(sql.ErrNoRows)
              },
              expectedUser:  domain.User{},
              expectedError: domain.ErrNotFound,
          },
          {
              name:   "InternalError",
              userID: 3,
              mockBehavior: func(mock sqlmock.Sqlmock, userID int32) {
                  mock.ExpectQuery(regexp.QuoteMeta("SELECT id, name, email, created_at FROM users WHERE id = $1")).
                      WithArgs(userID).
                      WillReturnError(errors.New("database error"))
              },
              expectedUser:  domain.User{},
              expectedError: errors.New("failed to get user: database error"),
          },
      }

      for _, tc := range testCases {
          t.Run(tc.name, func(t *testing.T) {
              tc.mockBehavior(mock, tc.userID)

              user, err := repo.GetUser(context.Background(), tc.userID)

              assert.Equal(t, tc.expectedUser.ID, user.ID)
              assert.Equal(t, tc.expectedUser.Name, user.Name)
              assert.Equal(t, tc.expectedUser.Email, user.Email)
              if tc.expectedError != nil {
                  assert.EqualError(t, err, tc.expectedError.Error())
              } else {
                  assert.NoError(t, err)
              }

              require.NoError(t, mock.ExpectationsWereMet())
          })
      }

  }
  

Explanation:

  • We use the go-sqlmock library to mock the database connection. This allows us to simulate different database scenarios without interacting with a real database.
  • We define a mockBehavior function that sets up the expectations for the mock database connection. This tells the mock database connection what to return when a specific SQL query is executed.
  • We use the assert package from testify to assert that the actual results match the expected results.
  • We use `require.NoError` to immediately fail the test if setting up the mock fails.
  • We use `regexp.QuoteMeta` to escape the SQL query string when defining expectations for the mock.
  • We verify that all expectations were met using mock.ExpectationsWereMet().
  • We use table-driven tests to cover multiple scenarios in a single test function.

6. Implementing Pagination

Pagination is a common requirement for APIs that return large datasets. It allows clients to retrieve data in smaller chunks, improving performance and user experience.

To implement pagination in our repository, we need to add two parameters to our list methods: limit and offset. The limit parameter specifies the maximum number of records to return, and the offset parameter specifies the starting point for the query.

Here’s how we can modify our ListUsers method to support pagination:


  package domain

  type ListUsersParams struct {
      Limit  int32
      Offset int32
  }
  

  package repository

  func (r *userRepository) ListUsers(ctx context.Context, arg domain.ListUsersParams) ([]domain.User, error) {
      users, err := r.queries.ListUsers(ctx, sqlc.ListUsersParams{
          Limit:  arg.Limit,
          Offset: arg.Offset,
      })
      if err != nil {
          return nil, fmt.Errorf("failed to list users: %w", err)
      }

      domainUsers := make([]domain.User, len(users))
      for i, user := range users {
          domainUsers[i] = domain.User{
              ID:        user.ID,
              Name:      user.Name,
              Email:     user.Email,
              CreatedAt: user.CreatedAt,
          }
      }

      return domainUsers, nil
  }
  

And here’s how to update our sqlc query:


  -- name: ListUsers :many
  SELECT * FROM users
  ORDER BY id
  LIMIT $1
  OFFSET $2;
  

Key considerations for pagination:

  • Validation: Validate the limit and offset parameters to prevent abuse. For example, you can limit the maximum value of limit.
  • Default values: Provide default values for limit and offset.
  • Total count: Return the total number of records in the dataset. This allows the client to calculate the total number of pages. You can achieve this by executing a separate query that counts the total number of records.
  • Cursor-based pagination: Consider using cursor-based pagination for large datasets. Cursor-based pagination is more efficient than offset-based pagination because it avoids scanning the entire dataset.
  • Order by: Always specify an `ORDER BY` clause to ensure consistent pagination results.
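As a sketch of the cursor-based alternative: the query filters on the last-seen key instead of skipping rows (for example, SELECT id, name, email, created_at FROM users WHERE id > $1 ORDER BY id LIMIT $2), and the page token handed to clients can simply encode the last ID returned. The helper names below are illustrative, not part of the article's code:

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strconv"
)

// encodeCursor turns the last-seen ID into an opaque page token.
func encodeCursor(lastID int32) string {
	return base64.StdEncoding.EncodeToString([]byte(strconv.FormatInt(int64(lastID), 10)))
}

// decodeCursor reverses encodeCursor; an empty token means
// "start from the beginning".
func decodeCursor(token string) (int32, error) {
	if token == "" {
		return 0, nil
	}
	raw, err := base64.StdEncoding.DecodeString(token)
	if err != nil {
		return 0, fmt.Errorf("invalid cursor: %w", err)
	}
	id, err := strconv.ParseInt(string(raw), 10, 32)
	if err != nil {
		return 0, fmt.Errorf("invalid cursor: %w", err)
	}
	return int32(id), nil
}
```

Because the token is opaque to clients, you can later switch its contents (for example, to a composite of created_at and id) without changing the API surface.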

7. Common Challenges and Solutions

While the Repository pattern with sqlc offers numerous benefits, you might encounter some challenges during implementation. Here’s a breakdown of common hurdles and their solutions:

  • Challenge: Complex Queries: Handling complex SQL queries with multiple joins, subqueries, or conditional logic can be challenging with sqlc. The generated code might become difficult to manage.

    Solution: Break down complex queries into smaller, more manageable queries. Use database views or stored procedures to encapsulate complex logic. Consider using CTEs (Common Table Expressions) within your SQL queries to improve readability and maintainability. If the complexity is unavoidable, carefully document the query and consider adding integration tests to ensure its correctness.

  • Challenge: Database Schema Changes: Changes to the database schema require regenerating the sqlc code. This can be disruptive if the changes are frequent.

    Solution: Use a database migration tool, such as golang-migrate/migrate or Liquibase, to manage database schema changes. Automate the sqlc code generation process as part of your CI/CD pipeline. This ensures that the generated code is always up-to-date. Carefully plan schema changes to minimize disruption. Use backward-compatible changes whenever possible.

  • Challenge: Mapping Database Types to Go Types: sqlc handles most common database types, but you might encounter situations where you need to customize the mapping between database types and Go types.

    Solution: sqlc allows you to define custom type overrides in the sqlc.yaml configuration file. This allows you to specify how specific database types should be mapped to Go types. You can also use custom scanner/value implementations for more complex type conversions.
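As an illustration (field names follow sqlc's version 2 configuration format; check them against the sqlc version you use, and note that the paths and the uuid mapping here are assumptions), type overrides in sqlc.yaml might look like:

```yaml
version: "2"
sql:
  - engine: "postgresql"
    queries: "db/query"
    schema: "db/migration"
    gen:
      go:
        package: "sqlc"
        out: "db/sqlc"
        overrides:
          # Map timestamptz columns explicitly to time.Time.
          - db_type: "timestamptz"
            go_type: "time.Time"
          # Map uuid columns to a third-party UUID type.
          - db_type: "uuid"
            go_type:
              import: "github.com/google/uuid"
              type: "UUID"
```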

  • Challenge: Performance Bottlenecks: Poorly written SQL queries can lead to performance bottlenecks. Identifying and resolving these bottlenecks can be challenging.

    Solution: Use database profiling tools to identify slow queries. Optimize your SQL queries by adding indexes, rewriting complex queries, and using appropriate data types. Consider using caching to reduce the number of database queries. Analyze query execution plans to identify areas for improvement. Regularly review and optimize your SQL queries.

  • Challenge: Testing with Transactions: Testing code that uses transactions requires careful setup and teardown to ensure that the tests are isolated and consistent.

    Solution: Use mock database connections for unit tests. For integration tests, use a dedicated test database and ensure that each test runs in a separate transaction that is rolled back after the test completes. Implement helper functions to manage transaction setup and teardown. Consider using the `testfixtures` library to load test data into the database before each test.

8. Performance Optimization Strategies

Optimizing the performance of your data access layer is crucial for building responsive and scalable applications. Here are several strategies to consider:

  • Indexing: Properly indexing your database tables is the most fundamental optimization. Indexes allow the database to quickly locate rows that match a specific query, avoiding a full table scan.

    Action: Analyze your SQL queries and identify the columns that are frequently used in WHERE clauses, JOIN conditions, and ORDER BY clauses. Create indexes on these columns.
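For the users table from section 1, that could look like the following (illustrative only; note that in PostgreSQL the UNIQUE constraint on email already creates an index, so an additional index on that column alone would be redundant):

```sql
-- Support queries that filter or sort on creation time.
CREATE INDEX IF NOT EXISTS idx_users_created_at ON users (created_at);

-- Composite index for queries that filter on name and paginate by id.
CREATE INDEX IF NOT EXISTS idx_users_name_id ON users (name, id);
```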

  • Query Optimization: Writing efficient SQL queries is essential for performance. Avoid using SELECT * and instead select only the columns that are needed. Use appropriate JOIN types (e.g., INNER JOIN, LEFT JOIN) based on your requirements. Avoid using OR conditions in WHERE clauses, as they can prevent the database from using indexes effectively. Consider rewriting complex queries using CTEs or subqueries to improve readability and performance.

    Action: Use database profiling tools to identify slow queries. Analyze the query execution plans to identify areas for improvement. Experiment with different query formulations to find the most efficient one.

  • Connection Pooling: Creating a new database connection for each request is expensive. Connection pooling allows you to reuse existing connections, reducing the overhead of connection establishment.

    Action: Use a connection pooling library, such as github.com/jackc/pgx/v5/pgxpool (for PostgreSQL) or the built-in database/sql connection pooling mechanism, to manage your database connections. Configure the connection pool with appropriate settings for maximum connections, idle connections, and connection timeouts.

  • Caching: Caching frequently accessed data can significantly improve performance by reducing the number of database queries.

    Action: Implement caching at different layers of your application. Use a local in-memory cache (e.g., sync.Map or github.com/patrickmn/go-cache) for frequently accessed data that doesn’t change often. Use a distributed cache (e.g., Redis or Memcached) for data that needs to be shared across multiple servers. Use database query caching to cache the results of expensive queries.

  • Batching: Batching multiple database operations into a single request can reduce network overhead and improve performance.

    Action: Use batching APIs provided by your database driver to perform multiple INSERT, UPDATE, or DELETE operations in a single request. Consider using a library like github.com/keegancsmith/sqlf to construct complex SQL queries with placeholders for batching.
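With the pgx driver, sqlc can also generate bulk inserts from a :copyfrom query annotation, which uses PostgreSQL's COPY protocol. Where that isn't available, one common technique is building a single multi-row INSERT; the sketch below shows the idea (buildBulkInsert is an illustrative helper, not a library API, and assumes table and column names come from trusted code):

```go
package main

import (
	"fmt"
	"strings"
)

// buildBulkInsert constructs one multi-row INSERT statement with
// PostgreSQL-style numbered placeholders ($1, $2, ...). Table and
// column names must come from trusted code, never from user input.
func buildBulkInsert(table string, cols []string, rows int) string {
	var b strings.Builder
	fmt.Fprintf(&b, "INSERT INTO %s (%s) VALUES ", table, strings.Join(cols, ", "))
	n := 1
	groups := make([]string, 0, rows)
	for i := 0; i < rows; i++ {
		ph := make([]string, len(cols))
		for j := range cols {
			ph[j] = fmt.Sprintf("$%d", n)
			n++
		}
		groups = append(groups, "("+strings.Join(ph, ", ")+")")
	}
	b.WriteString(strings.Join(groups, ", "))
	return b.String()
}
```

The resulting statement can be executed once with a flattened argument slice, turning N round trips into one.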

  • Read Replicas: If your application is read-heavy, consider using read replicas to offload read traffic from the primary database.

    Action: Configure read replicas for your database. Route read queries to the read replicas and write queries to the primary database. Ensure that the read replicas are kept in sync with the primary database using replication.

  • Profiling: Regularly profile your application to identify performance bottlenecks in your data access layer. Use profiling tools, such as pprof, to collect performance data and identify slow queries or inefficient code.

    Action: Set up profiling for your application. Analyze the profiling data to identify performance bottlenecks. Optimize your code and SQL queries based on the profiling results.

9. Conclusion

In this two-part series, we’ve explored how to build a clean, performant, and testable data access layer in Go using the Repository pattern and sqlc. We’ve covered the benefits of this approach, including improved code organization, testability, and performance. We’ve also discussed common challenges and best practices for working with repositories and sqlc in Go.

By following the principles and techniques outlined in this series, you can build a robust and maintainable data access layer that will serve as a solid foundation for your Go applications. Remember to prioritize code clarity, testability, and performance when designing and implementing your data access layer.

The combination of the Repository pattern and sqlc offers a powerful way to interact with databases in Go, allowing you to write cleaner, more testable, and more efficient code. By mastering these techniques, you’ll be well-equipped to build high-quality Go applications that can handle the demands of real-world workloads.

Further exploration includes looking into more advanced sqlc features, implementing caching strategies more deeply, and exploring different database technologies and their Go drivers. The world of data access in Go is constantly evolving, so continuous learning is key!

omcoding
