Compiler Plugins and Macro Programming: Extending Scala's Capabilities

Scala's extensible architecture allows developers to extend the language and compiler through plugins and macros. This meta-programming capability enables the creation of domain-specific languages, code generators, and powerful development tools. In this comprehensive lesson, we'll explore how to build compiler plugins, write macros, and leverage Scala's meta-programming features.

Understanding Scala's Compilation Process

Before diving into meta-programming, it's essential to understand how Scala compilation works.

Compilation Phases

// The Scala compiler goes through several phases (the names below follow the Scala 2 pipeline; Scala 3's is similar but differs in detail):
// 1. Parse: Source code → AST (Abstract Syntax Tree)
// 2. Namer: Create symbols and bindings
// 3. PackageObjects: Handle package objects
// 4. Typer: Type checking and inference
// 5. SuperAccessors: Generate super accessors
// 6. ExtMethods: Extension methods handling
// 7. Pickler: Serialize symbols
// 8. RefChecks: Reference checking
// 9. Uncurry: Eliminate currying
// 10. TailCalls: Optimize tail calls
// 11. Specialization: Specialize generic classes
// 12. ExplicitOuter: Make outer references explicit
// 13. Erasure: Type erasure
// 14. PostErasure: Post-erasure cleanup
// 15. Lambdalift: Lambda lifting
// 16. Constructors: Constructor handling
// 17. Flatten: Flatten nested classes
// 18. Mixin: Mixin composition
// 19. Cleanup: Final cleanup
// 20. GenBCode: Generate bytecode
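
// You can list the exact phases of the compiler you are running with:
//   scalac -Xshow-phases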

// Understanding AST structure
import scala.meta._

// Example: parsing Scala code into AST
val source = "val x = 42 + 10"
val tree = source.parse[Stat].get

tree match {
  case q"val $name = $expr" =>
    println(s"Variable: $name")
    println(s"Expression: $expr")
}

// AST manipulation example
def incrementLiterals(tree: Tree): Tree = tree.transform {
  case Lit.Int(n) => Lit.Int(n + 1)
}

val originalCode = q"val x = 5 + 3"
val incrementedCode = incrementLiterals(originalCode)
println(incrementedCode) // val x = 6 + 4
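
// Scalameta trees can be inspected in two complementary ways: .syntax
// pretty-prints the code, while .structure shows the underlying node shape
println(tree.syntax)    // val x = 42 + 10
println(tree.structure) // Defn.Val(Nil, List(Pat.Var(Term.Name("x"))), None, Term.ApplyInfix(...))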

Scala Meta: Modern Meta-Programming

Scalameta provides a clean, modern API for syntactic meta-programming: parsing, analyzing, and transforming Scala source code. It handles both Scala 2 and Scala 3 syntax and underpins tools such as Scalafmt and Scalafix.

Tree Transformation and Analysis

import scala.meta._

// Tree pattern matching
object TreeAnalyzer {
  def analyzeFunction(tree: Tree): String = tree match {
    case q"def $name(...$paramss): $tpe = $body" =>
      s"Function $name with ${paramss.flatten.length} parameters returns $tpe"

    case q"val $name: $tpe = $rhs" =>
      s"Value $name of type $tpe initialized with $rhs"

    case q"var $name: $tpe = $rhs" =>
      s"Variable $name of type $tpe initialized with $rhs"

    case q"class $name[..$tparams](...$paramss) extends ..$parents { ..$stats }" =>
      s"Class $name with ${tparams.length} type parameters and ${stats.length} members"

    case _ => "Unknown construct"
  }

  // Find all method calls in a tree
  def findMethodCalls(tree: Tree): List[Term.Apply] = {
    tree.collect {
      case call @ q"$expr(...$argss)" => call
    }
  }

  // Find all variable declarations
  def findVariables(tree: Tree): List[Defn.Val] = {
    tree.collect {
      case valDef @ q"val ..$pats = $expr" => valDef
    }
  }

  // Count statement nodes (a rough proxy for lines of code; comments and blank lines are not part of the AST)
  def countLOC(tree: Tree): Int = {
    tree.collect {
      case _: Stat => 1
    }.sum
  }
}

// Code generation with Scala Meta
object CodeGenerator {

  // Generate getter and setter methods
  def generateAccessors(fieldName: String, fieldType: String): List[Stat] = {
    val getter = q"def ${Term.Name(fieldName)}: ${Type.Name(fieldType)} = _${Term.Name(fieldName)}"
    val setter = q"def ${Term.Name(s"${fieldName}_=")}(value: ${Type.Name(fieldType)}): Unit = _${Term.Name(fieldName)} = value"
    List(getter, setter)
  }

  // Generate case class from field definitions
  def generateCaseClass(className: String, fields: List[(String, String)]): Defn.Class = {
    val params = fields.map { case (name, tpe) =>
      param"$name: ${Type.Name(tpe)}"
    }

    q"case class ${Type.Name(className)}(..$params)"
  }

  // Generate apply method for companion object
  def generateApplyMethod(className: String, fields: List[(String, String)]): Defn.Def = {
    val params = fields.map { case (name, tpe) =>
      param"$name: ${Type.Name(tpe)}"
    }
    val args = fields.map { case (name, _) => Term.Name(name) }

    q"def apply(..$params): ${Type.Name(className)} = new ${Type.Name(className)}(..$args)"
  }

  // Generate validation methods
  def generateValidator(className: String, validations: List[(String, String)]): Defn.Def = {
    val validationCalls = validations.map { case (field, _) =>
      q"${Term.Name("validate" + field.capitalize)}()"
    }

    q"""
    def validate(): Either[List[String], ${Type.Name(className)}] = {
      val errors = List(..$validationCalls).flatten
      if (errors.isEmpty) Right(this) else Left(errors)
    }
    """
  }
}

// Tree transformation utilities
object TreeTransformers {

  // Replace all occurrences of a specific identifier
  def replaceIdentifier(tree: Tree, oldName: String, newName: String): Tree = {
    tree.transform {
      case name @ Term.Name(`oldName`) => Term.Name(newName)
      case name @ Type.Name(`oldName`) => Type.Name(newName)
    }
  }

  // Add logging around method calls (a sketch: naively re-wrapping every
  // application will also match calls in the code it generates, so a real
  // implementation must guard against re-transforming its own output)
  def addLogging(tree: Tree): Tree = {
    tree.transform {
      case q"$expr(...$argss)" =>
        val label = Lit.String(expr.syntax)
        q"""
        {
          println("Calling method: " + $label)
          $expr(...$argss)
        }
        """
    }
  }

  // Convert mutable variables to immutable values
  def makeImmutable(tree: Tree): Tree = {
    tree.transform {
      case q"var $name: $tpe = $rhs" => q"val $name: $tpe = $rhs"
    }
  }

  // Add null checks to method parameters
  def addNullChecks(tree: Tree): Tree = {
    tree.transform {
      case q"def $name(...$paramss): $tpe = $body" =>
        val nullChecks = paramss.flatten.collect {
          case param"$name: $tpe" if !tpe.toString.contains("Option") =>
            q"require($name != null, ${Lit.String(s"$name cannot be null")})"
        }

        q"def $name(...$paramss): $tpe = { ..$nullChecks; $body }"
    }
  }
}
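
// A small usage sketch of the transformers above
val before = q"val total = price * quantity"
val renamed = TreeTransformers.replaceIdentifier(before, "price", "unitPrice")
println(renamed.syntax) // val total = unitPrice * quantity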

Custom Syntax Extensions

import scala.meta._

// Custom DSL for database queries
object DatabaseDSL {

  // Define custom syntax for SQL-like queries
  implicit class QuerySyntax(sc: StringContext) {
    def sql(args: Any*): DatabaseQuery = {
      val query = sc.parts.zip(args).map { case (part, arg) =>
        part + arg.toString
      }.mkString + sc.parts.last

      DatabaseQuery(query)
    }
  }

  case class DatabaseQuery(query: String) {
    def where(condition: String): DatabaseQuery = 
      copy(query = s"$query WHERE $condition")

    def orderBy(field: String): DatabaseQuery = 
      copy(query = s"$query ORDER BY $field")

    def limit(n: Int): DatabaseQuery = 
      copy(query = s"$query LIMIT $n")
  }

  // Custom operators for fluent APIs
  case class FluentBuilder(operations: List[String] = List.empty) {
    def ~>(operation: String): FluentBuilder = 
      copy(operations = operations :+ operation)

    def |>(finalOperation: String): String = 
      (operations :+ finalOperation).mkString(" -> ")
  }

  implicit def stringToBuilder(s: String): FluentBuilder = FluentBuilder(List(s))

  // Usage:
  // val pipeline = "input" ~> "transform" ~> "validate" |> "output"
}
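
// Usage sketch of the query DSL and fluent builder (illustrative only; real
// code should use bind parameters rather than string interpolation)
import DatabaseDSL._

val minAge = 18
val usersQuery = sql"SELECT * FROM users".where(s"age > $minAge").orderBy("name").limit(10)
println(usersQuery.query) // SELECT * FROM users WHERE age > 18 ORDER BY name LIMIT 10

val pipeline = "input" ~> "transform" ~> "validate" |> "output"
println(pipeline) // input -> transform -> validate -> output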

// Custom control structures using macros
object ControlStructures {

  // Retry mechanism with exponential backoff
  def retry[T](maxAttempts: Int, baseDelay: Int = 100)(operation: => T): T = {
    var attempts = 0
    var lastException: Throwable = null

    while (attempts < maxAttempts) {
      try {
        return operation
      } catch {
        case e: Exception =>
          lastException = e
          attempts += 1
          if (attempts < maxAttempts) {
            Thread.sleep(baseDelay * math.pow(2, attempts - 1).toLong)
          }
      }
    }

    throw new RuntimeException(s"Operation failed after $maxAttempts attempts", lastException)
  }

  // Timing wrapper
  def timed[T](operation: => T): (T, Long) = {
    val start = System.nanoTime()
    val result = operation
    val duration = System.nanoTime() - start
    (result, duration)
  }

  // Safe division with custom error handling
  def safeDivide(numerator: Double, denominator: Double): Either[String, Double] = {
    if (denominator == 0) {
      Left("Division by zero")
    } else if (denominator.isInfinite || denominator.isNaN) {
      Left("Invalid denominator")
    } else {
      Right(numerator / denominator)
    }
  }

  // Pattern-based conditional execution
  def when[T](condition: Boolean)(action: => T): Option[T] = {
    if (condition) Some(action) else None
  }

  def unless[T](condition: Boolean)(action: => T): Option[T] = {
    when(!condition)(action)
  }
}
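
// Usage sketch of the custom control structures
import ControlStructures._

val (answer, nanos) = timed {
  retry(maxAttempts = 3) {
    42 // stand-in for a flaky operation that eventually succeeds
  }
}
val slowWarning = when(nanos > 1000000L)(s"call took ${nanos / 1000} microseconds")
val ratio = safeDivide(10.0, 4.0) // Right(2.5)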

// Custom collection operations
object CollectionExtensions {

  implicit class RichList[T](list: List[T]) {
    def chunksOf(size: Int): List[List[T]] = {
      list.grouped(size).toList
    }

    def rotateLeft(n: Int): List[T] = {
      if (list.isEmpty) list
      else {
        val shift = ((n % list.length) + list.length) % list.length // handles negative n too
        list.drop(shift) ++ list.take(shift)
      }
    }

    def rotateRight(n: Int): List[T] = {
      rotateLeft(-n)
    }

    def intersperse(separator: T): List[T] = {
      list match {
        case Nil => Nil
        case head :: Nil => List(head)
        case head :: tail => head :: separator :: tail.intersperse(separator)
      }
    }

    // zipWithIndex, but starting the index at 1
    def zipWithIndexFrom1: List[(T, Int)] = {
      list.zipWithIndex.map { case (elem, idx) => (elem, idx + 1) }
    }
  }

  implicit class RichMap[K, V](map: Map[K, V]) {
    def mergeWith(other: Map[K, V])(combine: (V, V) => V): Map[K, V] = {
      (map.keySet ++ other.keySet).map { key =>
        (map.get(key), other.get(key)) match {
          case (Some(v1), Some(v2)) => key -> combine(v1, v2)
          case (Some(v1), None) => key -> v1
          case (None, Some(v2)) => key -> v2
          case (None, None) => throw new IllegalStateException("Impossible case")
        }
      }.toMap
    }

    def filterByValue(predicate: V => Boolean): Map[K, V] = {
      map.filter { case (_, value) => predicate(value) }
    }

    def mapValues2[W](f: V => W): Map[K, W] = {
      map.map { case (k, v) => k -> f(v) }
    }
  }
}
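
// Usage sketch of the collection extensions
import CollectionExtensions._

List(1, 2, 3, 4, 5).chunksOf(2)        // List(List(1, 2), List(3, 4), List(5))
List(1, 2, 3, 4).rotateLeft(1)         // List(2, 3, 4, 1)
List("a", "b", "c").intersperse("|")   // List(a, |, b, |, c)
Map("a" -> 1).mergeWith(Map("a" -> 2, "b" -> 3))(_ + _) // Map(a -> 3, b -> 3)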

Inline Functions and Compile-Time Programming

Scala 3 introduces the inline modifier for compile-time evaluation and meta-programming.

Inline Functions and Transparent Inline

// Basic inline functions
inline def square(x: Int): Int = x * x

// The compiler will replace square(5) with 25 at compile time
val result = square(5) // Becomes: val result = 25

// Inline with type parameters
inline def cast[T](value: Any): T = value.asInstanceOf[T]

// Transparent inline - the precise return type depends on the argument
import scala.compiletime.erasedValue

transparent inline def valueOf[T]: Any = 
  inline erasedValue[T] match {
    case _: String => "string"
    case _: Int => 42
    case _: Boolean => true
    case _ => null
  }

val stringValue: String = valueOf[String] // Type is String, value is "string"
val intValue: Int = valueOf[Int] // Type is Int, value is 42

// Compile-time conditionals: the condition of an `inline if` must reduce to a
// constant during inlining, so a runtime check such as Properties.isWin cannot
// be used here; assume the flag is supplied as a compile-time constant instead
inline val targetIsWindows = false // assumption: provided by the build

inline def platformSpecific: String = {
  inline if (targetIsWindows) "Windows"
  else "Linux"
}

// Inline recursion for compile-time computation
// Inline recursion for compile-time computation (the parameter must be inline
// so the condition reduces to a constant)
inline def factorial(inline n: Int): Int = {
  inline if (n <= 1) 1
  else n * factorial(n - 1)
}

val fact5 = factorial(5) // Computed at compile time: 120

// Inlined string formatting: the body is expanded at each call site, but the
// replacement itself still runs when the expanded code runs
inline def compile_time_format(inline pattern: String, args: Any*): String = {
  args.foldLeft(pattern)((acc, arg) => acc.replaceFirst("\\{\\}", arg.toString))
}

val message = compile_time_format("Hello {}, you have {} messages", "Alice", 3)

Compile-Time Error Reporting

import scala.compiletime._

// Custom compile-time error messages
inline def requirePositive(inline value: Int): Int = {
  inline if (value <= 0) {
    error("Value must be positive, got: " + value.toString)
  } else value
}

// Usage:
val positive = requirePositive(5) // OK
// val negative = requirePositive(-3) // Compile-time error!

// Compile-time type checking with custom messages
inline def requireSubtype[Sub, Super](value: Sub): Sub = {
  inline if (!summonFrom {
    case _: (Sub <:< Super) => true
    case _ => false
  }) {
    error("Type " + constValue[Sub] + " is not a subtype of " + constValue[Super])
  } else value
}

// Compile-time assertion with custom predicates
inline def staticAssert(inline condition: Boolean, inline message: String): Unit = {
  inline if (!condition) {
    error(message)
  }
}

// Usage in type-level programming
trait NonEmpty[T]
object NonEmpty {
  inline given stringNonEmpty: NonEmpty[String] = new NonEmpty[String] {}
  inline given listNonEmpty[A]: NonEmpty[List[A]] = new NonEmpty[List[A]] {}
}

inline def requireNonEmpty[T](value: T): T = {
  // summonInline fails compilation with a missing-given error when no
  // NonEmpty[T] instance is available, which serves as the static assertion here
  summonInline[NonEmpty[T]]
  value
}

// Compile-time validation of string literals. Note that `inline if` only folds
// primitive operations and string concatenation on constants, so a check like
// contains is normally implemented as a quoted macro inspecting the literal
inline def validateEmail(inline email: String): String = {
  inline if (!email.contains("@")) {
    error("Invalid email format: " + email)
  } else email
}

val validEmail = validateEmail("user@example.com") // OK
// val invalidEmail = validateEmail("invalid-email") // Compile-time error!

// Regex matching is not constant-folded by `inline if`, so a working version of
// this check would be a quoted macro that compiles the pattern at expansion time
inline def validatePattern(inline pattern: String, inline input: String): String = {
  inline if (!java.util.regex.Pattern.compile(pattern).matcher(input).matches()) {
    error("Input '" + input + "' does not match pattern '" + pattern + "'")
  } else input
}

Compile-Time Code Generation

import scala.compiletime._
import scala.deriving._

// Automatic type class derivation
trait Show[T] {
  def show(value: T): String
}

object Show {
  given Show[String] = (s: String) => s
  given Show[Int] = _.toString
  given Show[Boolean] = _.toString

  // Automatic derivation for case classes
  inline given derived[T](using m: Mirror.Of[T]): Show[T] = {
    inline m match {
      case s: Mirror.SumOf[T] => showSum(s)
      case p: Mirror.ProductOf[T] => showProduct(p)
    }
  }

  inline def showProduct[T](p: Mirror.ProductOf[T]): Show[T] = {
    new Show[T] {
      def show(value: T): String = {
        val label = constValue[p.MirroredLabel]
        val elemShows = summonAll[Tuple.Map[p.MirroredElemTypes, Show]]
        val elemLabels = constValueTuple[p.MirroredElemLabels]

        val values = Tuple.fromProduct(value.asInstanceOf[Product])
        val fields = showElements(values, elemShows, elemLabels)
        s"$label($fields)"
      }
    }
  }

  inline def showSum[T](s: Mirror.SumOf[T]): Show[T] = {
    new Show[T] {
      def show(value: T): String = {
        val ord = s.ordinal(value)
        val elemShows = summonAll[Tuple.Map[s.MirroredElemTypes, Show]]
        val elemLabels = constValueTuple[s.MirroredElemLabels]

        // Get the appropriate show instance for the specific case
        showElement(value, elemShows, elemLabels, ord)
      }
    }
  }

  inline def summonAll[T <: Tuple]: Tuple.Map[T, Show] = {
    inline erasedValue[T] match {
      case _: EmptyTuple => EmptyTuple
      case _: (t *: ts) => summonInline[Show[t]] *: summonAll[ts]
    }
  }

  def showElements(values: Tuple, shows: Tuple, labels: Tuple): String = {
    // Implementation to show each element
    "..." // Simplified
  }

  def showElement(value: Any, shows: Tuple, labels: Tuple, ord: Int): String = {
    // Implementation to show specific sum type element
    "..." // Simplified
  }

  def constValueTuple[T <: Tuple]: Tuple = {
    // Extract compile-time constant values
    EmptyTuple // Simplified
  }
}

// Usage of automatic derivation
case class Person(name: String, age: Int, isStudent: Boolean)
case class Address(street: String, city: String, zipCode: String)

given Show[Person] = Show.derived
given Show[Address] = Show.derived

val person = Person("Alice", 25, false)
println(summon[Show[Person]].show(person)) // Person(Alice, 25, false) with full helper implementations

// Compile-time configuration validation
case class DatabaseConfig(
  host: String,
  port: Int,
  database: String,
  ssl: Boolean
)

// Note: `inline if` (used inside staticAssert) only reduces constant conditions,
// so field selections on a constructor call are not folded in general; a robust
// version is a quoted macro that inspects the argument expressions (see below)
inline def validateDatabaseConfig(inline config: DatabaseConfig): DatabaseConfig = {
  staticAssert(config.port > 0 && config.port <= 65535, "Port must be between 1 and 65535")
  staticAssert(config.host.nonEmpty, "Host cannot be empty")
  staticAssert(config.database.nonEmpty, "Database name cannot be empty")
  config
}

// This validates at compile time
val validConfig = validateDatabaseConfig(DatabaseConfig("localhost", 5432, "mydb", true))
// val invalidConfig = validateDatabaseConfig(DatabaseConfig("", 70000, "", false)) // Error!

// Compile-time JSON validation
case class ApiResponse(status: String, data: Map[String, Any])

// (the same caveat about constant conditions applies here)
inline def validateApiResponse(inline response: ApiResponse): ApiResponse = {
  staticAssert(
    response.status == "success" || response.status == "error",
    "Status must be 'success' or 'error'"
  )
  response
}

// Compile-time performance optimizations
object PerformanceOptimizations {

  // "Unrolled" loop: the body is inlined at the call site, but the iteration
  // itself still happens at runtime; true unrolling would require a macro that
  // emits the repeated calls directly
  inline def unrolledLoop[T](inline count: Int)(inline body: Int => T): List[T] = {
    inline if (count <= 0) Nil
    else (0 until count).map(body).toList
  }

  // Lookup tables built where the call is inlined (the table is still filled at
  // runtime; embedding the values as constants would require a macro)
  inline def precomputedTable[T](inline f: Int => T)(inline size: Int): Array[T] = {
    val table = new Array[Any](size).asInstanceOf[Array[T]]
    var i = 0
    while (i < size) {
      table(i) = f(i)
      i += 1
    }
    table
  }

  // Fast modular arithmetic with compile-time constants
  inline def fastMod(value: Int, inline modulus: Int): Int = {
    inline if (isPowerOfTwo(modulus)) {
      value & (modulus - 1)
    } else {
      value % modulus
    }
  }

  inline def isPowerOfTwo(inline n: Int): Boolean = {
    n > 0 && (n & (n - 1)) == 0
  }
}
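
// With a constant modulus the power-of-two branch should be selected during
// inlining, leaving only the bit mask in the generated code
val masked = PerformanceOptimizations.fastMod(1234, 64)  // reduces to 1234 & 63
val general = PerformanceOptimizations.fastMod(1234, 60) // falls back to 1234 % 60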

Building Compiler Plugins

Compiler plugins allow you to extend the Scala compiler with custom transformations and analysis. The examples below use the Scala 2 compiler's plugin API (scala.tools.nsc); Scala 3 provides a similar but distinct API under dotty.tools.dotc.plugins.

Creating a Simple Compiler Plugin

// Plugin definition (in separate project/module)
import scala.tools.nsc
import nsc.Global
import nsc.Phase
import nsc.plugins.Plugin
import nsc.plugins.PluginComponent

class LoggingPlugin(val global: Global) extends Plugin {
  import global._

  val name = "logging-plugin"
  val description = "Adds automatic logging to methods"
  val components = List[PluginComponent](Component)

  private object Component extends PluginComponent {
    val global: LoggingPlugin.this.global.type = LoggingPlugin.this.global
    val runsAfter = List[String]("typer")
    val phaseName = LoggingPlugin.this.name

    def newPhase(_prev: Phase) = new LoggingPhase(_prev)

    class LoggingPhase(prev: Phase) extends StdPhase(prev) {
      override def name = LoggingPlugin.this.name

      def apply(unit: CompilationUnit): Unit = {
        unit.body = transformTree(unit.body)
      }

      def transformTree(tree: Tree): Tree = {
        val transformer = new LoggingTransformer
        transformer.transform(tree)
      }
    }
  }

  class LoggingTransformer extends Transformer {
    override def transform(tree: Tree): Tree = tree match {
      case dd @ DefDef(mods, name, tparams, vparamss, tpt, rhs)
        if !dd.symbol.isAccessor && !dd.symbol.isSynthetic =>

        val logStatement = q"""println("Entering method: " + ${name.toString})"""
        val newRhs = q"""{ $logStatement; ${transform(rhs)} }"""

        treeCopy.DefDef(dd, mods, name, tparams, vparamss, tpt, newRhs)

      case _ => super.transform(tree)
    }
  }
}

// Plugin configuration (scalac-plugin.xml)
/*
<plugin>
  <name>logging-plugin</name>
  <classname>com.example.LoggingPlugin</classname>
</plugin>
*/

// Build configuration (build.sbt)
/*
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value

// To use the plugin
scalacOptions += "-Xplugin:path/to/logging-plugin.jar"
*/
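
// If the plugin is published as an artifact, sbt can also resolve and enable it
// via addCompilerPlugin (hypothetical coordinates):
/*
addCompilerPlugin("com.example" %% "logging-plugin" % "0.1.0")
*/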

Advanced Compiler Plugin Features

// Performance analysis plugin
class PerformanceAnalysisPlugin(val global: Global) extends Plugin {
  import global._

  val name = "performance-analysis"
  val description = "Analyzes code for performance issues"
  val components = List[PluginComponent](AnalysisComponent)

  private object AnalysisComponent extends PluginComponent {
    val global: PerformanceAnalysisPlugin.this.global.type = PerformanceAnalysisPlugin.this.global
    val runsAfter = List[String]("typer")
    val phaseName = PerformanceAnalysisPlugin.this.name

    def newPhase(_prev: Phase) = new AnalysisPhase(_prev)

    class AnalysisPhase(prev: Phase) extends StdPhase(prev) {
      override def name = PerformanceAnalysisPlugin.this.name

      def apply(unit: CompilationUnit): Unit = {
        val analyzer = new PerformanceAnalyzer
        analyzer.traverse(unit.body)
        analyzer.reportWarnings(unit)
      }
    }
  }

  class PerformanceAnalyzer extends Traverser {
    var warnings = List.empty[String]

    override def traverse(tree: Tree): Unit = tree match {
      // Detect inefficient string concatenation
      case Apply(Select(lhs, TermName("$plus")), List(rhs))
        if lhs.tpe <:< definitions.StringClass.tpe =>
        warnings ::= s"Inefficient string concatenation at ${tree.pos}: consider using StringBuilder"
        super.traverse(tree)

      // Detect nested loops that might be inefficient
      case outer @ Apply(Select(collection1, TermName("foreach" | "map" | "filter")), _) =>
        tree.collect {
          case inner @ Apply(Select(collection2, TermName("foreach" | "map" | "filter")), _) 
            if inner != outer =>
            warnings ::= s"Nested collection operations at ${inner.pos}: consider flattening or using for-comprehension"
        }
        super.traverse(tree)

      // Detect synchronization in tight loops
      case Apply(Select(_, TermName("synchronized")), _) =>
        findEnclosingLoop(tree) match {
          case Some(loopPos) =>
            warnings ::= s"Synchronization inside loop at ${tree.pos}: consider moving synchronization outside loop"
          case None =>
        }
        super.traverse(tree)

      // Detect large object creation in loops
      case New(_) =>
        findEnclosingLoop(tree) match {
          case Some(loopPos) =>
            warnings ::= s"Object creation inside loop at ${tree.pos}: consider object pooling or moving outside loop"
          case None =>
        }
        super.traverse(tree)

      case _ => super.traverse(tree)
    }

    def findEnclosingLoop(tree: Tree): Option[Position] = {
      // Simplified implementation
      None
    }

    def reportWarnings(unit: CompilationUnit): Unit = {
      if (warnings.nonEmpty) {
        warnings.reverse.foreach(warning => global.reporter.warning(unit.body.pos, warning))
      }
    }
  }
}

// Code generation plugin
class CodeGenerationPlugin(val global: Global) extends Plugin {
  import global._

  val name = "code-generation"
  val description = "Generates boilerplate code based on annotations"
  val components = List[PluginComponent](GenerationComponent)

  private object GenerationComponent extends PluginComponent {
    val global: CodeGenerationPlugin.this.global.type = CodeGenerationPlugin.this.global
    val runsAfter = List[String]("typer")
    val phaseName = CodeGenerationPlugin.this.name

    def newPhase(_prev: Phase) = new GenerationPhase(_prev)

    class GenerationPhase(prev: Phase) extends StdPhase(prev) {
      override def name = CodeGenerationPlugin.this.name

      def apply(unit: CompilationUnit): Unit = {
        unit.body = new CodeGenerator().transform(unit.body)
      }
    }
  }

  class CodeGenerator extends Transformer {
    override def transform(tree: Tree): Tree = tree match {
      // Generate toString method for classes annotated with @ToString
      case cd @ ClassDef(mods, name, tparams, impl) if hasToStringAnnotation(mods) =>
        val toStringMethod = generateToStringMethod(cd)
        val newTemplate = deriveTemplate(impl) {
          case body => body :+ toStringMethod
        }
        treeCopy.ClassDef(cd, mods, name, tparams, newTemplate)

      // Generate equals and hashCode for classes annotated with @EqualsHashCode
      case cd @ ClassDef(mods, name, tparams, impl) if hasEqualsHashCodeAnnotation(mods) =>
        val equalsMethod = generateEqualsMethod(cd)
        val hashCodeMethod = generateHashCodeMethod(cd)
        val newTemplate = deriveTemplate(impl) {
          case body => body ++ List(equalsMethod, hashCodeMethod)
        }
        treeCopy.ClassDef(cd, mods, name, tparams, newTemplate)

      case _ => super.transform(tree)
    }

    def hasToStringAnnotation(mods: Modifiers): Boolean = {
      mods.annotations.exists(_.tpe.typeSymbol.name.toString == "ToString")
    }

    def hasEqualsHashCodeAnnotation(mods: Modifiers): Boolean = {
      mods.annotations.exists(_.tpe.typeSymbol.name.toString == "EqualsHashCode")
    }

    def generateToStringMethod(classDef: ClassDef): Tree = {
      val className = classDef.name.toString
      val fields = extractFields(classDef)
      val fieldStrings = fields.map(field => q"""${field.name.toString} + "=" + ${Ident(field.name)}""")
      val body = q"""$className + "(" + List(..$fieldStrings).mkString(", ") + ")" """

      q"override def toString: String = $body"
    }

    def generateEqualsMethod(classDef: ClassDef): Tree = {
      val className = classDef.name
      val fields = extractFields(classDef)
      val fieldComparisons = fields.map(field => q"this.${field.name} == other.${field.name}")
      val allComparisons = fieldComparisons.reduceOption((a, b) => q"$a && $b").getOrElse(q"true")

      q"""
      override def equals(obj: Any): Boolean = obj match {
        case other: $className => $allComparisons
        case _ => false
      }
      """
    }

    def generateHashCodeMethod(classDef: ClassDef): Tree = {
      val fields = extractFields(classDef)
      val fieldHashes = fields.map(field => q"${Ident(field.name)}.hashCode()")

      q"override def hashCode(): Int = _root_.java.util.Objects.hash(..$fieldHashes)"
    }

    def extractFields(classDef: ClassDef): List[ValDef] = {
      classDef.impl.body.collect {
        case field: ValDef if field.mods.hasFlag(Flag.PARAMACCESSOR) => field
      }
    }
  }
}

// Plugin for dependency injection
class DIPlugin(val global: Global) extends Plugin {
  import global._

  val name = "dependency-injection"
  val description = "Automatically wires dependencies"
  val components = List[PluginComponent](DIComponent)

  // Implementation would include:
  // - Scanning for @Inject annotations
  // - Building dependency graph
  // - Generating constructor injection code
  // - Validating circular dependencies
}

Testing Compiler Plugins

import scala.tools.nsc.Settings
import scala.tools.nsc.reporters.ConsoleReporter
import scala.tools.nsc.Global
import scala.reflect.io.VirtualDirectory
import scala.reflect.internal.util.BatchSourceFile

class CompilerPluginTest {

  def testPlugin(sourceCode: String, plugin: String): Unit = {
    val settings = new Settings()
    settings.outputDirs.setSingleOutput(new VirtualDirectory("(memory)", None))
    settings.processArgumentString(s"-Xplugin:$plugin")

    val reporter = new ConsoleReporter(settings)
    val compiler = new Global(settings, reporter)

    val sourceFile = new BatchSourceFile("test.scala", sourceCode)
    val run = new compiler.Run()
    run.compileSources(List(sourceFile))

    assert(!reporter.hasErrors, "Compilation should succeed")
  }

  def testLoggingPlugin(): Unit = {
    val sourceCode = """
      class TestClass {
        def testMethod(x: Int): Int = {
          x * 2
        }
      }
    """

    testPlugin(sourceCode, "path/to/logging-plugin.jar")
  }

  def testPerformancePlugin(): Unit = {
    val sourceCode = """
      class TestClass {
        def inefficientMethod(): String = {
          var result = ""
          for (i <- 1 to 100) {
            result = result + i.toString  // Should trigger warning
          }
          result
        }
      }
    """

    testPlugin(sourceCode, "path/to/performance-plugin.jar")
  }
}

Advanced Meta-Programming Patterns

Type-Safe Configuration with Macros

import scala.quoted._

// Compile-time configuration validation
case class DatabaseConfig(host: String, port: Int, database: String)
case class ServerConfig(port: Int, threads: Int)
case class AppConfig(database: DatabaseConfig, server: ServerConfig)

object ConfigMacros {

  // Validate configuration at compile time
  inline def validateConfig[T](config: T): T = ${validateConfigImpl('config)}

  def validateConfigImpl[T: Type](config: Expr[T])(using Quotes): Expr[T] = {
    import quotes.reflect._

    config.asTerm match {
      case Inlined(_, _, Apply(_, args)) =>
        validateConfigFields(args)
        config
      case _ =>
        report.error("Invalid configuration structure")
        config
    }
  }

  def validateConfigFields(using q: Quotes)(args: List[q.reflect.Term]): Unit = {
    import q.reflect._

    args.foreach {
      case Literal(StringConstant(host)) if host.isEmpty =>
        report.error("Host cannot be empty")
      case Literal(IntConstant(port)) if port <= 0 || port > 65535 =>
        report.error(s"Invalid port: $port")
      case _ => // Valid field
    }
  }

  // Generate configuration from environment variables
  inline def configFromEnv[T]: T = ${configFromEnvImpl[T]}

  def configFromEnvImpl[T: Type](using Quotes): Expr[T] = {
    import quotes.reflect._

    TypeRepr.of[T] match {
      case tpe if tpe.typeSymbol.name == "DatabaseConfig" =>
        // the environment is read at macro-expansion (compile) time
        val host = sys.env.getOrElse("DB_HOST", "localhost")
        val port = sys.env.getOrElse("DB_PORT", "5432").toInt
        val database = sys.env.getOrElse("DB_NAME", "default")

        '{ DatabaseConfig(${Expr(host)}, ${Expr(port)}, ${Expr(database)}) }.asExprOf[T]

      case _ =>
        report.error(s"Unsupported configuration type: ${TypeRepr.of[T]}")
        '{ ??? }.asExprOf[T]
    }
  }
}

// Usage
val dbConfig = ConfigMacros.validateConfig(DatabaseConfig("localhost", 5432, "mydb"))
val envConfig = ConfigMacros.configFromEnv[DatabaseConfig]

Code Generation for Serialization

import scala.quoted._
import scala.deriving._

object SerializationMacros {

  // Generate JSON serializer
  inline def generateJsonSerializer[T]: JsonSerializer[T] = ${generateJsonSerializerImpl[T]}

  def generateJsonSerializerImpl[T: Type](using Quotes): Expr[JsonSerializer[T]] = {
    import quotes.reflect._

    // Distinguish products from sums by which Mirror can be summoned
    Expr.summon[Mirror.ProductOf[T]] match {
      case Some(_) => generateProductSerializer[T]()
      case None =>
        Expr.summon[Mirror.SumOf[T]] match {
          case Some(_) => generateSumSerializer[T]()
          case None =>
            report.error(s"Cannot derive JsonSerializer for ${TypeRepr.of[T].show}")
            '{ ??? }
        }
    }
  }

  def generateProductSerializer[T: Type]()(using Quotes): Expr[JsonSerializer[T]] = {
    import quotes.reflect._

    val fields = getFieldNames[T]()
    val serializers = getFieldSerializers[T]()

    '{
      new JsonSerializer[T] {
        def serialize(value: T): String = {
          // field names and serializers are baked in at expansion time; the
          // values are read positionally via Product at runtime (simplified)
          val product = value.asInstanceOf[Product]
          val pairs = ${Expr.ofList(fields)}.zip(${Expr.ofList(serializers)})
          pairs.zipWithIndex.map { case ((name, serializer), idx) =>
            s""""$name": ${serializer.serialize(product.productElement(idx))}"""
          }.mkString("{", ",", "}")
        }
      }
    }
  }

  def generateSumSerializer[T: Type]()(using Quotes): Expr[JsonSerializer[T]] = {
    import quotes.reflect._

    '{
      new JsonSerializer[T] {
        def serialize(value: T): String = {
          // Pattern match on sum type cases
          value match {
            case _ => """{"type": "unknown"}"""
          }
        }
      }
    }
  }

  def getFieldNames[T: Type]()(using Quotes): List[Expr[String]] = {
    // Extract field names from case class
    List('{"name"}, '{"value"}) // Simplified
  }

  def getFieldSerializers[T: Type]()(using Quotes): List[Expr[JsonSerializer[Any]]] = {
    // Get serializers for each field type
    List('{ JsonSerializer.stringSerializer }, '{ JsonSerializer.intSerializer }) // Simplified
  }
}

trait JsonSerializer[T] {
  def serialize(value: T): String
}

object JsonSerializer {
  given stringSerializer: JsonSerializer[String] = (value: String) => s""""$value""""
  given intSerializer: JsonSerializer[Int] = (value: Int) => value.toString
  given booleanSerializer: JsonSerializer[Boolean] = (value: Boolean) => value.toString

  // Automatic derivation using macro
  inline given derived[T]: JsonSerializer[T] = SerializationMacros.generateJsonSerializer[T]
}

// Usage
case class Person(name: String, age: Int, isStudent: Boolean)

val personSerializer = summon[JsonSerializer[Person]]
val json = personSerializer.serialize(Person("Alice", 25, false))
println(json) // {"name": "Alice", "age": 25, "isStudent": false}

Performance-Oriented Meta-Programming

object PerformanceMacros {

  // Unroll loops at compile time
  inline def unroll[T](inline times: Int)(inline body: Int => T): List[T] = {
    ${unrollImpl('times, 'body)}
  }

  def unrollImpl[T: Type](times: Expr[Int], body: Expr[Int => T])(using Quotes): Expr[List[T]] = {
    import quotes.reflect._

    times.value match {
      case Some(n) =>
        val expressions = (0 until n).map { i =>
          '{ ${body}(${Expr(i)}) }
        }.toList
        Expr.ofList(expressions)

      case None =>
        report.error("Loop count must be a compile-time constant")
        '{ List.empty[T] }
    }
  }

  // Generate specialized methods for different types
  inline def generateSpecialized[T]: SpecializedOps[T] = ${generateSpecializedImpl[T]}

  def generateSpecializedImpl[T: Type](using Quotes): Expr[SpecializedOps[T]] = {
    import quotes.reflect._

    TypeRepr.of[T] match {
      case tpe if tpe =:= TypeRepr.of[Int] =>
        '{
          new SpecializedOps[T] {
            def add(a: T, b: T): T = (a.asInstanceOf[Int] + b.asInstanceOf[Int]).asInstanceOf[T]
            def multiply(a: T, b: T): T = (a.asInstanceOf[Int] * b.asInstanceOf[Int]).asInstanceOf[T]
            def zero: T = 0.asInstanceOf[T]
          }
        }

      case tpe if tpe =:= TypeRepr.of[Double] =>
        '{
          new SpecializedOps[T] {
            def add(a: T, b: T): T = (a.asInstanceOf[Double] + b.asInstanceOf[Double]).asInstanceOf[T]
            def multiply(a: T, b: T): T = (a.asInstanceOf[Double] * b.asInstanceOf[Double]).asInstanceOf[T]
            def zero: T = 0.0.asInstanceOf[T]
          }
        }

      case _ =>
        report.error(s"No specialization available for type ${TypeRepr.of[T].show}")
        '{ ??? }
    }
  }

  // Compile-time lookup table generation
  inline def generateLookupTable[K, V](inline entries: (K, V)*): Map[K, V] = {
    ${generateLookupTableImpl('entries)}
  }

  def generateLookupTableImpl[K: Type, V: Type](entries: Expr[Seq[(K, V)]])(using Quotes): Expr[Map[K, V]] = {
    import quotes.reflect._

    // Extract the individual entry expressions with the Varargs extractor
    val constantEntries: Seq[Expr[(K, V)]] = entries match {
      case Varargs(elems) => elems
      case _ =>
        report.error("Lookup table entries must be passed as literal varargs")
        Seq.empty
    }

    // Generate optimized map implementation
    val size = constantEntries.length
    if (size <= 8) {
      // Generate if-else chain for small maps
      generateIfElseMap[K, V](constantEntries)
    } else {
      // Generate hash table for larger maps
      generateHashMap[K, V](constantEntries)
    }
  }

  def generateIfElseMap[K: Type, V: Type](entries: Seq[Expr[(K, V)]])(using Quotes): Expr[Map[K, V]] = {
    // For a handful of entries a plain Map literal is generated; a real
    // implementation could instead emit an if/else chain over the constant keys
    '{ ${Expr.ofSeq(entries)}.toMap }
  }

  def generateHashMap[K: Type, V: Type](entries: Seq[Expr[(K, V)]])(using Quotes): Expr[Map[K, V]] = {
    // Same approach for larger collections; a specialized implementation could
    // precompute the bucket layout from the constant keys
    '{ ${Expr.ofSeq(entries)}.toMap }
  }
}

trait SpecializedOps[T] {
  def add(a: T, b: T): T
  def multiply(a: T, b: T): T
  def zero: T
}

// Usage examples
val unrolledResult = PerformanceMacros.unroll(5)(i => i * i) // [0, 1, 4, 9, 16]

val intOps = PerformanceMacros.generateSpecialized[Int]
val sum = intOps.add(5, 3) // Specialized int addition

val lookup = PerformanceMacros.generateLookupTable(
  "one" -> 1,
  "two" -> 2,
  "three" -> 3
)

Best Practices and Guidelines

When to Use Meta-Programming

// Good use cases for meta-programming:

// 1. Eliminating boilerplate code
trait JsonCodec[T] {
  def encode(value: T): String
  def decode(json: String): T
}

// Instead of writing manual codecs, generate them
case class User(id: Int, name: String, email: String)
given JsonCodec[User] = JsonCodec.derived // Generated automatically

// 2. Type-safe DSLs
val query = sql"SELECT * FROM users WHERE age > $minAge"
// Macro ensures SQL syntax is correct and parameters are properly typed

// 3. Compile-time validation (string checks like startsWith are not folded by
//    `inline if`, so in practice this is backed by a quoted macro)
inline def validateUrl(inline url: String): String = {
  inline if (!url.startsWith("http")) {
    compiletime.error("URL must start with http")
  } else url
}

val apiUrl = validateUrl("https://api.example.com") // Validated at compile time

// 4. Performance optimizations
inline def fastPower(base: Double, inline exponent: Int): Double = {
  inline if (exponent == 0) 1.0
  else inline if (exponent == 1) base
  else inline if (exponent == 2) base * base
  else math.pow(base, exponent) // Fallback for other cases
}

// Bad use cases - avoid meta-programming for:

// 1. Simple transformations that can be done with regular functions
// Don't: macro for simple string manipulation
// Do: regular function
def capitalize(s: String): String = s.capitalize

// 2. Complex business logic that should be explicit
// Don't: hide important business rules in macros
// Do: make business logic explicit and testable

// 3. Debugging-unfriendly transformations
// Don't: complex code generation that's hard to debug
// Do: simple, understandable transformations

Testing Meta-Programming Code

import scala.quoted.*
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class MacroTest extends AnyFlatSpec with Matchers {

  "Validation macro" should "accept valid input" in {
    val result = validateEmail("user@example.com")
    result should be("user@example.com")
  }

  it should "reject invalid input at compile time" in {
    // This test verifies that invalid input causes compilation error
    assertTypeError("""validateEmail("invalid-email")""")
  }

  "Code generation macro" should "generate correct code" in {
    case class TestCase(field1: String, field2: Int)

    val instance = TestCase("test", 42)
    val json = summon[JsonSerializer[TestCase]].serialize(instance)

    json should include("field1")
    json should include("test")
    json should include("field2")
    json should include("42")
  }

  "Inline function" should "be optimized away" in {
    // Test that inline functions produce expected results
    square(5) should be(25)
    factorial(5) should be(120)
  }

  // Note: ScalaTest's Assertions trait already provides assertTypeError and
  // assertDoesNotCompile as macros, so the call above uses the built-in helper.
}

// Property-based testing for generated code
import org.scalacheck.Properties
import org.scalacheck.Prop.forAll

object GeneratedCodeProperties extends Properties("GeneratedCode") {

  property("serialization round-trip") = forAll { (name: String, age: Int) =>
    val person = Person(name, age, false)
    val json = summon[JsonSerializer[Person]].serialize(person)
    // assumes a matching JsonDeserializer type class exists (not defined above)
    val deserialized = summon[JsonDeserializer[Person]].deserialize(json)

    deserialized == person
  }

  property("specialized operations") = forAll { (a: Int, b: Int) =>
    val ops = PerformanceMacros.generateSpecialized[Int]
    val result = ops.add(a, b)

    result == (a + b)
  }
}

Performance and Compilation Time

// Best practices for compilation performance:

object CompilationOptimizations {

  // 1. Minimize macro expansion depth
  inline def simpleTransform[T](value: T): T = value // Good: simple

  // Avoid deep recursion in macros
  // inline def complexTransform[T](value: T): T = ${deepRecursiveMacro('value)} // Bad

  // 2. Cache expensive computations
  object TypeLevelCache {
    type ComputedType[T] = T match {
      case String => List[String]
      case Int => List[Int]
      case Any => List[Any]
    }

    // Cache results to avoid recomputation
    private val cache = scala.collection.mutable.Map.empty[String, Any]

    inline def getCachedResult[T](inline key: String): T = {
      cache.getOrElseUpdate(key, computeExpensive[T]).asInstanceOf[T]
    }

    def computeExpensive[T]: T = ??? // Expensive computation
  }

  // 3. Use appropriate tools for the task
  // For simple code generation: use inline functions
  inline def generateSimpleGetter[T](inline fieldName: String): String = 
    s"def get${fieldName.capitalize}: T"

  // For complex transformations: use compiler plugins
  // For type-level programming: use match types and given instances

  // 4. Minimize dependencies in meta-programming code
  // Keep macro implementations simple and focused

  // 5. Profile compilation times
  // Use -Xprint:all to see generated code
  // Use -Ystatistics to see compilation statistics
}

// Debugging generated code
object DebuggingTechniques {

  // 1. Print values while developing inline code (note: a println inside an
  //    inline def runs when the expanded code runs; printing at compile time
  //    is done with report.info inside a macro implementation)
  inline def debugMacro[T](value: T): T = {
    println(s"Macro expanded with value: $value")
    value
  }

  // 2. Use -Xprint compiler flag to see transformations
  // scalac -Xprint:typer MyFile.scala

  // 3. Simplify complex macros step by step
  def debugComplexMacro[T: Type](expr: Expr[T])(using Quotes): Expr[T] = {
    import quotes.reflect._

    // Print AST structure
    println(s"Input AST: ${expr.asTerm.show}")

    // Apply transformation step by step
    val step1 = transformStep1(expr)
    println(s"After step 1: ${step1.asTerm.show}")

    val step2 = transformStep2(step1)
    println(s"After step 2: ${step2.asTerm.show}")

    step2
  }

  def transformStep1[T: Type](expr: Expr[T])(using Quotes): Expr[T] = expr
  def transformStep2[T: Type](expr: Expr[T])(using Quotes): Expr[T] = expr

  // 4. Unit test macro components separately
  def testMacroComponent(): Unit = {
    val input = "test input"
    val result = processMacroInput(input)
    assert(result == "expected output")
  }

  def processMacroInput(input: String): String = {
    // Testable logic extracted from macro
    input.toUpperCase
  }
}

Conclusion

Compiler plugins and macro programming in Scala provide powerful capabilities for extending the language and creating sophisticated development tools. Key takeaways include:

Meta-Programming Capabilities:

  • Compiler plugins for custom language extensions
  • Inline functions for compile-time evaluation
  • Macros for code generation and transformation
  • Type-level programming for compile-time guarantees

Practical Applications:

  • Automatic code generation (serialization, boilerplate elimination)
  • Domain-specific languages with compile-time validation
  • Performance optimizations through code specialization
  • Static analysis and code quality tools

Advanced Techniques:

  • AST manipulation and transformation
  • Compile-time error reporting and validation
  • Type-safe configuration and dependency injection
  • Performance-oriented meta-programming patterns

Best Practices:

  • Use meta-programming judiciously - only when it provides clear benefits
  • Keep generated code simple and debuggable
  • Test meta-programming code thoroughly
  • Consider compilation time and maintainability
  • Document generated code and transformation logic

Tools and Ecosystem:

  • Scala Meta for modern meta-programming
  • Compiler API for advanced transformations
  • Testing frameworks for meta-programming code
  • Debugging techniques for generated code

Performance Considerations:

  • Minimize compilation overhead
  • Cache expensive computations
  • Profile and optimize compilation times
  • Use appropriate abstractions for each use case

Meta-programming is a powerful tool that should be used thoughtfully. When applied correctly, it can eliminate boilerplate, enforce correctness at compile time, and create more maintainable codebases. However, it's important to balance the benefits against complexity and maintainability concerns.

The future of Scala meta-programming continues to evolve with Scala 3's improved inline capabilities, better macro system, and enhanced compiler plugin architecture, making it easier to build sophisticated development tools and language extensions.