Add unit tests for the append_attention op.
Source: search for append_attention under custom_ops/gpu_ops/
Registration: custom_ops/gpu_ops/cpp_extensions.cc
Test file: tests/operators/test_append_attention.py
append_attention is a key attention-mechanism op. Tests should verify numerical correctness against a naive reference attention implementation, sweeping different sequence lengths, head counts, and head dims.
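A minimal sketch of the naive reference the tests could compare against. This assumes a `[batch, num_heads, seq_len, head_dim]` layout and plain scaled dot-product attention; the op's actual input layout, masking, and cache semantics must be confirmed from the source before writing the real test.

```python
import numpy as np


def naive_attention(q, k, v):
    """Naive scaled dot-product attention reference (hypothetical layout).

    q, k, v: float arrays of shape [batch, num_heads, seq_len, head_dim].
    Returns an array of the same shape as q.
    """
    head_dim = q.shape[-1]
    # scores: [batch, num_heads, seq_len_q, seq_len_k]
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(head_dim)
    # Numerically stable softmax over the key axis.
    scores = scores - scores.max(axis=-1, keepdims=True)
    probs = np.exp(scores)
    probs = probs / probs.sum(axis=-1, keepdims=True)
    return probs @ v


if __name__ == "__main__":
    # Illustrative shape sweep; the real parametrization is TBD.
    rng = np.random.default_rng(0)
    for seq_len in (1, 16, 128):
        for num_heads in (1, 8):
            for head_dim in (64, 128):
                shape = (2, num_heads, seq_len, head_dim)
                q = rng.standard_normal(shape).astype(np.float32)
                k = rng.standard_normal(shape).astype(np.float32)
                v = rng.standard_normal(shape).astype(np.float32)
                out = naive_attention(q, k, v)
                assert out.shape == shape
```

The same sweep would then call the real op and assert `np.allclose` against this reference, with tolerances loosened for fp16/bf16 paths.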
Branch: task/047-append-attention-test