<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.31 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-ietf-bmwg-savnet-sav-benchmarking-01" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.32.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Intra-domain and Inter-domain Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-ietf-bmwg-savnet-sav-benchmarking-01"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2026" month="March" day="17"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology Working Group</workgroup>
    <abstract>

<t>This document defines methodologies for benchmarking the performance of intra-domain and inter-domain source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing and have been implemented with various designs to perform SAV in their respective scenarios. This document treats a SAV device as a black box and defines the methodology in a manner that is agnostic to the underlying mechanisms. It provides a method for measuring the performance of existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are encouraged to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> depending on their network environments. However, existing intra-domain (intra-AS) and inter-domain (inter-AS) SAV mechanisms suffer from operational overhead and accuracy problems in various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document will help operators obtain a more accurate picture of SAV performance when their deployed devices enable SAV, and will also help vendors test the performance of their devices' SAV implementations.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support multiple SAV mechanisms, allowing operators to enable those most suitable for their specific network environments. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy (i.e., false positive and false negative rates), SAV protocol convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a software router, a virtual machine (VM) instance, or a container instance running as a SAV device. The resulting measurements support assessing SAV device performance and comparing various SAV mechanisms and implementations.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this document focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing “which SAV mechanism performs best” over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as a “micro-benchmark”).</t>
          </li>
        </ul>
        <t>This benchmark evaluates the SAV performance of individual devices (e.g., hardware/software routers) by comparing different SAV mechanisms under specific network scenarios. The results help determine the appropriate SAV deployment for real-world network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
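      <t>As a non-normative illustration of the data structure described above, a SAV table can be sketched as a mapping from source prefixes to the interfaces on which they are permitted; the prefixes and interface names below are hypothetical examples:</t>
      <sourcecode type="python"><![CDATA[
# Minimal, non-normative sketch of a SAV data plane: a table of
# (source prefix, permitted ingress interfaces), consulted for each
# incoming packet. Prefixes and interface names are hypothetical.
import ipaddress

class SavTable:
    def __init__(self):
        self.rules = []  # list of (network, set of permitted interfaces)

    def add_rule(self, prefix, interfaces):
        self.rules.append((ipaddress.ip_network(prefix), set(interfaces)))

    def validate(self, src, ingress_if):
        """Return True (permit) or False (discard) for a packet with
        source address 'src' arriving on interface 'ingress_if'."""
        addr = ipaddress.ip_address(src)
        for net, ifaces in self.rules:
            if addr in net:
                return ingress_if in ifaces
        return False  # no matching rule: discard (strict policy)

table = SavTable()
table.add_rule("2001:db8::/55", {"eth0"})
print(table.validate("2001:db8::1", "eth0"))  # legitimate -> True
print(table.validate("2001:db8::1", "eth1"))  # wrong interface -> False
]]></sourcecode>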
      <t>Host-facing Router: An edge router directly connected to a layer-2 host network.</t>
      <t>Customer-facing Router: An edge router connected to a non-BGP customer network that contains routers and runs a routing protocol.</t>
      <t>AS Border Router: An intra-domain router facing an external AS.</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup is in general compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and to other network devices to construct the network topologies introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that produces network traffic with various source and destination addresses in order to emulate spoofed or legitimate traffic. Choosing various proportions of spoofed and legitimate traffic is <bcp14>OPTIONAL</bcp14>.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the DUT. Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may connect directly to the DUT or link through intermediary devices; the nature of the connection is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester can generate both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionality to record all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The positioning of the DUT within the network topology has an impact on SAV performance. Therefore, the benchmarking process <bcp14>MUST</bcp14> include evaluating the DUT at multiple locations across the network to ensure a comprehensive assessment.</t>
        <t>The routing configurations of network devices may differ, and the resulting SAV rules depend on these settings. It is essential to clearly document the specific device configurations used during testing.</t>
        <t>Furthermore, the role of each device, such as host-facing router, customer-facing router, or AS border router in an intra-domain network, <bcp14>SHOULD</bcp14> be clearly identified. In an inter-domain context, the business relationships between ASes <bcp14>MUST</bcp14> also be specified.</t>
        <t>When evaluating data plane forwarding performance, the traffic generated by the Tester must be characterized by defined traffic rates, the ratio of spoofed to legitimate traffic, and the distribution of source addresses, as all of these factors can influence test results.</t>
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>SHOULD</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>SHOULD</bcp14> be measured from the result output of the DUT.
The standard deviation of the KPI test results <bcp14>SHOULD</bcp14> be analyzed for each fixed test setup, which helps characterize the stability of the DUT's performance. The data plane SAV table refreshing rate and data plane forwarding rate below <bcp14>SHOULD</bcp14> be tested with varying SAV table sizes for each fixed test setup, which helps measure the DUT's sensitivity to the SAV table size for these two KPIs.</t>
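      <t>As a non-normative example, the stability analysis described above can be sketched as follows; the repeated forwarding-rate samples are hypothetical:</t>
      <sourcecode type="python"><![CDATA[
# Non-normative sketch: analyzing the stability of repeated KPI
# measurements for one fixed test setup. The forwarding-rate
# samples (in Mpps) are hypothetical.
from statistics import mean, stdev

samples = [14.2, 14.0, 14.3, 13.9, 14.1]  # e.g., 5 repeated runs
avg = mean(samples)
sd = stdev(samples)   # sample standard deviation
cv = sd / avg         # coefficient of variation (relative stability)
print(f"mean={avg:.2f} Mpps, stdev={sd:.3f}, CV={cv:.1%}")
]]></sourcecode>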
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of legitimate packets that are incorrectly classified as spoofed by the DUT to the total number of legitimate packets sent to the DUT. For the purpose of this document, this metric is computed based on packet counts (i.e., on a per-packet basis).</t>
        <t>Note that other computation methods (e.g., based on byte counts) <bcp14>MAY</bcp14> be used for supplementary analysis, but are outside the scope of the normative definitions in this document.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of spoofed packets that are incorrectly classified as legitimate (i.e., not blocked) by the DUT to the total number of spoofed packets sent to the DUT. For the purpose of this document, this metric is computed based on packet counts (i.e., on a per-packet basis).</t>
        <t>Note that other computation methods (e.g., based on byte counts) <bcp14>MAY</bcp14> be used for supplementary analysis, but are outside the scope of the normative definitions in this document.</t>
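        <t>As a non-normative example, the false positive rate and false negative rate can be computed on a per-packet basis from packet counters as follows; the counter values are hypothetical:</t>
        <sourcecode type="python"><![CDATA[
# Non-normative sketch: per-packet false positive and false negative
# rates computed from packet counters. The counter values below are
# hypothetical examples.

def false_positive_rate(legit_sent, legit_blocked):
    """Fraction of legitimate packets incorrectly blocked by the DUT."""
    return legit_blocked / legit_sent

def false_negative_rate(spoofed_sent, spoofed_forwarded):
    """Fraction of spoofed packets incorrectly forwarded by the DUT."""
    return spoofed_forwarded / spoofed_sent

fpr = false_positive_rate(legit_sent=90_000, legit_blocked=45)
fnr = false_negative_rate(spoofed_sent=10_000, spoofed_forwarded=120)
print(f"FPR={fpr:.4%}, FNR={fnr:.2%}")  # FPR=0.0500%, FNR=1.20%
]]></sourcecode>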
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The protocol convergence time is the time elapsed from the beginning of a routing change to the completion of the corresponding SAV rule update, i.e., the period during which the SAV control plane protocol converges to update the SAV rules. This KPI indicates the convergence performance of the SAV protocol.</t>
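        <t>As a non-normative example, this KPI can be measured by timestamping the injected routing change and then polling the DUT until the expected SAV rule appears. In the sketch below, a simulated table reader stands in for a real DUT query interface:</t>
        <sourcecode type="python"><![CDATA[
# Non-normative sketch: measuring SAV protocol convergence time by
# polling the DUT's SAV table after a routing change. 'read_sav_table'
# stands in for a real DUT query (e.g., via CLI or a management API)
# and is hypothetical.
import time

def convergence_time(read_sav_table, expected_rule,
                     timeout=60.0, interval=0.01):
    """Seconds from the routing change (call this immediately after
    injecting it) until 'expected_rule' appears in the SAV table."""
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if expected_rule in read_sav_table():
            return time.monotonic() - start
        time.sleep(interval)
    raise TimeoutError("SAV rule update did not complete within timeout")

# Simulated DUT whose table is updated ~50 ms after the routing change.
t0 = time.monotonic()
fake_table = lambda: ({"2001:db8::/55"}
                      if time.monotonic() - t0 > 0.05 else set())
elapsed = convergence_time(fake_table, "2001:db8::/55")
print(f"convergence time: {elapsed * 1000:.0f} ms")
]]></sourcecode>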
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane packets that communicate SAV-related information, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane forwarding throughput for processing data plane traffic, and it can indicate the SAV data plane performance of the DUT. It is suggested to measure the data plane forwarding rate with SAV enabled and with SAV disabled on the DUT in order to determine the proportional decrease in forwarding rate. This helps analyze the efficiency of the DUT's SAV data plane implementation.</t>
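        <t>As a non-normative example, the proportional decrease can be computed from the two measured rates as follows; the rate values are hypothetical:</t>
        <sourcecode type="python"><![CDATA[
# Non-normative sketch: proportional decrease of the data plane
# forwarding rate when SAV is enabled. The measured rates (in Mpps)
# are hypothetical examples.

def sav_forwarding_penalty(rate_sav_disabled, rate_sav_enabled):
    """Fractional decrease in forwarding rate attributable to SAV."""
    return (rate_sav_disabled - rate_sav_enabled) / rate_sav_disabled

penalty = sav_forwarding_penalty(rate_sav_disabled=14.88,
                                 rate_sav_enabled=13.40)
print(f"forwarding rate decrease: {penalty:.1%}")
]]></sourcecode>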
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>Resource utilization refers to the CPU and memory usage of the SAV processes within the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the false positive rate and false negative rate of the DUT in processing both legitimate and spoofed traffic across various intra-domain network scenarios. These scenarios include SAV implementations for customer/host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document presents the test scenarios for evaluating intra-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as unused source addresses within the subnetwork, private network source addresses, internal-use-only source addresses of the subnetwork, and external source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different SAV mechanisms may differ in their capability to block packets with forged source addresses of various types. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                 |
|                         +~~~~~~~~~~+                       |
|                         | Router 1 |                       |
| FIB on DUT              +~~~~~~~~~~+                       |
| Dest           Next_hop   /\    |                          |
| 2001:db8::/55  Network 1   |    |                          |
|                            |    \/                         |
|                         +----------+                       |
|                         |   DUT    |                       |
|                         +----------+                       |
|                           /\    |                          |
|               Traffic with |    | Traffic with             |
|        source IP addresses |    | destination IP addresses |
|           of 2001:db8::/55 |    | of 2001:db8::/55         |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                             |    \/
                      +--------------------+
                      |Tester (Sub Network)|
                      |  (2001:db8::/55)   |
                      +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> illustrates an intra-domain symmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router and connects to Router 1 for Internet access. A sub network, which resides within the AS and uses the prefix 2001:db8::/55, is connected to the DUT. The Tester emulates a sub network by advertising this prefix in the control plane and generating both spoofed and legitimate traffic in the data plane. In this setup, the Tester is configured so that inbound traffic destined for 2001:db8::/55 arrives via the DUT. The DUT learns the route to 2001:db8::/55 from the Tester, while the Tester sends outbound traffic with source addresses within 2001:db8::/55 to the DUT, simulating a symmetric routing scenario between the two. The IP addresses used in this test case are examples; testers may substitute other addresses, and the same applies to the other test cases.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To verify whether the DUT can generate accurate SAV rules for customer or host network under symmetric routing conditions, construct a testbed as depicted in <xref target="intra-domain-customer-syn"/>. The Tester is connected to the DUT and acts as a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (with source addresses in 2001:db8::/55) and spoofed traffic (with source addresses in 2001:db8:0:200::/55) toward the DUT. The prefix 2001:db8:0:200::/55 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the sub network.</t>
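          <t>As a non-normative illustration, step 3 of the procedure above can be sketched as follows. The snippet only derives the source-address mix; an actual Tester would feed these addresses to its traffic generator, and the 1:9 ratio is one example from the suggested range.</t>
          <sourcecode type="python"><![CDATA[
# Non-normative sketch: building the source-address mix for step 3.
# Legitimate sources are drawn from 2001:db8::/55 and spoofed sources
# from 2001:db8:0:200::/55 (the two prefixes do not overlap). A real
# Tester would feed these addresses to its traffic generator.
import ipaddress
import random

LEGIT = ipaddress.ip_network("2001:db8::/55")
SPOOFED = ipaddress.ip_network("2001:db8:0:200::/55")

def random_addr(net, rng):
    """A uniformly random address inside 'net'."""
    return net.network_address + rng.randrange(net.num_addresses)

def source_mix(total, spoofed_ratio, seed=0):
    """Return (address, is_spoofed) pairs with the given spoofed fraction."""
    rng = random.Random(seed)
    n_spoofed = round(total * spoofed_ratio)
    mix = [(random_addr(SPOOFED, rng), True) for _ in range(n_spoofed)]
    mix += [(random_addr(LEGIT, rng), False) for _ in range(total - n_spoofed)]
    rng.shuffle(mix)
    return mix

mix = source_mix(total=1000, spoofed_ratio=0.1)  # 1:9 spoofed:legitimate
print(sum(s for _, s in mix), "spoofed of", len(mix))
]]></sourcecode>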
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                         Test Network Environment                       |
|                             +~~~~~~~~~~+                               |
|                             | Router 2 |                               |
| FIB on DUT                  +~~~~~~~~~~+   FIB on Router 1             |
| Dest                Next_hop  /\      \    Dest               Next_hop |
| 2001:db8::/56       Network 1 /        \ 2001:db8:0:100::/56  Network 1|
| 2001:db8:0:100::/56 Router 2 /         \/ 2001:db8::/56       Router 2 |
|                    +----------+     +~~~~~~~~~~+                       |
|                    |   DUT    |     | Router 1 |                       |
|                    +----------+     +~~~~~~~~~~+                       |
|                       /\               /                               |
|           Traffic with \              / Traffic with                   |
|     source IP addresses \            / destination IP addresses        |
|   of 2001:db8:0:100::/56 \          / of 2001:db8:0:100::/56           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                             \      \/
                      +---------------------+
                      | Tester (Sub Network)|
                      |   (2001:db8::/55)   |
                      +---------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-asyn"/> illustrates an intra-domain asymmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router. A sub network, i.e., a customer/host network within the AS, is connected to both the DUT and Router 1, and uses the prefix 2001:db8::/55. The Tester emulates a sub network and handles both its control plane and data plane functions. In this setup, the Tester is configured so that inbound traffic destined for 2001:db8::/56 is received only from the DUT, while inbound traffic for 2001:db8:0:100::/56 is received only from Router 1. The DUT learns the route to prefix 2001:db8::/56 from the Tester, and Router 1 learns the route to 2001:db8:0:100::/56 from the Tester. Both the DUT and Router 1 then advertise their respective learned prefixes to Router 2. Consequently, the DUT learns the route to 2001:db8:0:100::/56 from Router 2, and Router 1 learns the route to 2001:db8::/56 from Router 2. The Tester sends outbound traffic with source addresses in 2001:db8:0:100::/56 to the DUT, simulating an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To determine whether the DUT can generate accurate SAV rules under asymmetric routing conditions, set up the test environment as shown in <xref target="intra-domain-customer-asyn"/>. The Tester is connected to both the DUT and Router 1 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 2001:db8::/56) and legitimate traffic (using source addresses in 2001:db8:0:100::/56) toward the DUT. The prefix 2001:db8::/56 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the sub network.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 2001:db8:0:200::/55
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 2001:db8::/55  Network 1 |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|         of 2001:db8::/55 |    | of 2001:db8::/55         |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |    Sub Network     |                  |
|                  |   (2001:db8::/55)  |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network under intra-domain symmetric routing conditions. The network topology resembles that of <xref target="intra-domain-customer-syn"/>, with the key difference being the positioning of the DUT. In this case, the DUT is connected to Router 1 and the Internet, while the Tester emulates the Internet. The DUT performs SAV from an Internet-facing perspective, as opposed to a customer/host-facing role.</t>
          <t>The <strong>procedure</strong> for testing SAV for an Internet-facing network in an intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under symmetric routing, set up the test environment as depicted in <xref target="intra-domain-internet-syn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 2001:db8::/55) and legitimate traffic (using source addresses in 2001:db8:0:200::/55) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 2001:db8:0:200::/55
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                                    |
|                          |   \/                                    |
|                        +----------+                                |
|                        |    DUT   |                                |
| FIB on Router 1        +----------+  FIB on Router 2               |
| Dest           Next_hop  /\     \   Dest                 Next_hop  |
| 2001:db8::/56  Network 1 /       \  2001:db8:0:100::/56 Network 1  |
| 2001:db8:0:100::/56 DUT /        \/ 2001:db8::/56  DUT             |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                        |
|               | Router 1 |     | Router 2 |                        |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                        |
|                     /\            /                                |
|         Traffic with \           / Traffic with                    |
|   source IP addresses \         / destination IP addresses         |
| of 2001:db8:0:100::/56 \       / of 2001:db8:0:100::/56            |
|                         \     \/                                   |
|                  +--------------------+                            |
|                  |    Sub Network     |                            |
|                  |   (2001:db8::/55)  |                            |
|                  +--------------------+                            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-asyn"/> illustrates a test case for SAV in an Internet-facing network under intra-domain asymmetric routing conditions. The network topology is identical to that of <xref target="intra-domain-customer-asyn"/>, with the key distinction being the placement of the DUT. In this scenario, the DUT is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester emulates the Internet, and the DUT performs Internet-facing SAV rather than customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under asymmetric routing, construct the test environment as shown in <xref target="intra-domain-internet-asyn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 2001:db8::/55) and legitimate traffic (using source addresses in 2001:db8:0:200::/55) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest           Next_hop  /\    |                         |
| 2001:db8::/55  Network 1 |     |                         |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|         of 2001:db8::/55 |    | of 2001:db8::/55         |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (2001:db8::/55)  |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network under intra-domain symmetric routing conditions. The network topology in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for aggregation-router-facing SAV under symmetric routing, construct the test environment as shown in <xref target="intra-domain-agg-syn"/>. The Tester is connected to Router 1 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (using source addresses in 2001:db8::/56) and spoofed traffic (using source addresses in 2001:db8:0:200::/55) toward Router 1. The prefix 2001:db8:0:200::/55 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1.</t>
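The prefix relationships used above can be checked mechanically. As a sketch (illustrative only), Python's standard ipaddress module confirms that 2001:db8:0:200::/55 lies entirely outside the sub network prefix, so any source address drawn from it must appear spoofed to the DUT:

```python
import ipaddress

sub_net = ipaddress.ip_network("2001:db8::/56")           # legitimate source prefix
spoof_net = ipaddress.ip_network("2001:db8:0:200::/55")   # not part of the sub network

def is_legitimate_source(addr, allowed):
    """Return True when addr falls inside one of the allowed prefixes."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in allowed)

legit_ok = is_legitimate_source("2001:db8::1", [sub_net])
spoof_ok = is_legitimate_source("2001:db8:0:200::1", [sub_net])
overlap = sub_net.overlaps(spoof_net)   # the two pools are disjoint
```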
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                           |
|                         +----------+                                 |
|                         |    DUT   | SAV facing Router 1 and 2       |
| FIB on Router 1         +----------+   FIB on Router 2               |
| Dest            Next_hop  /\      \    Dest                Next_hop  |
| 2001:db8::/56   Network 1 /        \  2001:db8:0:100::/56  Network 1 |
| 2001:db8:0:100::/56  DUT /         \/ 2001:db8::/56        DUT       |
|                  +~~~~~~~~~~+    +~~~~~~~~~~+                        |
|                  | Router 1 |    | Router 2 |                        |
|                  +~~~~~~~~~~+    +~~~~~~~~~~+                        |
|                       /\           /                                 |
|           Traffic with \          / Traffic with                     |
|     source IP addresses \        / destination IP addresses          |
|         of 2001:db8::/56 \      / of 2001:db8:0:100::/56             |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            \   \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |  (2001:db8::/55)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-asyn"/> illustrates the test case for SAV in an aggregation-router-facing network under intra-domain asymmetric routing conditions. The network topology in <xref target="intra-domain-agg-asyn"/> is identical to that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to both Router 1 and Router 2 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under asymmetric routing conditions, construct the test environment as shown in <xref target="intra-domain-agg-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish an asymmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 2001:db8:0:200::/55) and legitimate traffic (using source addresses in 2001:db8::/56) toward Router 1. The prefix 2001:db8:0:200::/55 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1 and Router 2.</t>
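The spoofed-to-legitimate ratios suggested above (from 1:9 up to 9:1) can be realized by interleaving the two streams deterministically. One possible way to build such a schedule is sketched below; the stream labels are hypothetical placeholders for the two configured traffic classes.

```python
def build_mix(spoofed_parts, legit_parts, total):
    """Return a packet schedule with the requested spoofed:legitimate ratio.

    The result is a list of "spoofed"/"legit" labels that a traffic
    generator could map onto the two configured streams.
    """
    parts = spoofed_parts + legit_parts
    if total % parts != 0:
        raise ValueError("total must be a multiple of the ratio parts")
    block = ["spoofed"] * spoofed_parts + ["legit"] * legit_parts
    return block * (total // parts)

mix_1_to_9 = build_mix(1, 9, 100)   # 10 spoofed, 90 legitimate
mix_9_to_1 = build_mix(9, 1, 100)   # 90 spoofed, 10 legitimate
counts = (mix_1_to_9.count("spoofed"), mix_9_to_1.count("spoofed"))
```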
          <figure anchor="intra-domain-frr-topo">
            <name>Intra-domain SAV under Fast Reroute (FRR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|     +------------+                     +------------+       |
|     |   Router2  |---------------------|   Router3  |       |
|     +------------+                     +------------+       |
|          /\                                  /\             |
|          |                                   |              |
|          | backup path                       | primary path |
|          |                                   |              |
|     +-----------------------------------------------+       |
|     |                     DUT                       |       |
|     +-----------------------------------------------+       |
|                           /\                                |
|                           | Legitimate and                  |
|                           | Spoofed Traffic                 |
|                           |                                 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            |
                  +--------------------+
                  |Tester (Sub Network)|
                  +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Fast Reroute (FRR) Scenario</strong>: Fast Reroute (FRR) mechanisms such as Loop-Free Alternates (LFA) or Topology-Independent Loop-Free Alternates (TI-LFA) provide sub-second restoration of traffic forwarding after link or node failures. During FRR activation, temporary forwarding changes may occur before the control plane converges, potentially impacting SAV rule consistency and causing transient
false positives or false negatives.</t>
          <t>The <strong>procedure</strong> for testing SAV under the FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure the DUT and adjacent routers with FRR protection for the primary link (Router3–DUT).</t>
            </li>
            <li>
              <t>The Tester continuously sends legitimate and spoofed traffic toward the protected prefix.</t>
            </li>
            <li>
              <t>Trigger a link failure between Router3 and the DUT, causing FRR switchover to Router2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during the switchover and after reconvergence.</t>
            </li>
            <li>
              <t>Restore the primary link and verify that SAV rules revert correctly.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT should maintain correct SAV behavior throughout FRR activation and recovery. False positive and false negative rates <bcp14>SHOULD</bcp14> remain minimal during FRR events, and SAV rules <bcp14>SHOULD</bcp14> update promptly to reflect restored routing.</t>
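In the FRR test, the quantity of interest is how the error rates behave in the transient window around the switchover. A hedged sketch of one way to bin the measurements, assuming the Tester timestamps each verdict (the event tuple layout and field names are hypothetical):

```python
def windowed_fnr(events, t_fail, t_converged):
    """Compute the spoofed-traffic leak rate (false negatives) in three
    windows: before the failure, during FRR switchover, after reconvergence.

    events: list of (timestamp, klass, forwarded) tuples, where klass is
    "spoofed" or "legit" and forwarded records the DUT's verdict.
    """
    windows = {"before": [0, 0], "switchover": [0, 0], "after": [0, 0]}
    for ts, klass, forwarded in events:
        if klass != "spoofed":
            continue
        if ts >= t_converged:
            name = "after"
        elif ts >= t_fail:
            name = "switchover"
        else:
            name = "before"
        sent, leaked = windows[name]
        windows[name] = [sent + 1, leaked + (1 if forwarded else 0)]
    return {n: (leaked / sent if sent else 0.0)
            for n, (sent, leaked) in windows.items()}

events = [
    (0.1, "spoofed", False), (0.2, "legit", True),
    (1.1, "spoofed", True),                       # transient leak during switchover
    (2.5, "spoofed", False), (2.6, "spoofed", False),
]
rates = windowed_fnr(events, t_fail=1.0, t_converged=2.0)
```

The same binning applies symmetrically to false positives on the legitimate stream.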
          <figure anchor="intra-domain-pbr-topo">
            <name>Intra-domain SAV under Policy-based Routing (PBR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|             Test Network Environment           |
|                 +------------+                 |
|                 |  Router2   |                 |
|                 +------------+                 |
|                       /\                       |
|                        | default path          |
|                        |                       |
|               +----------------+               |
|               |       DUT      |               |
|               +----------------+               |
|                 /\           /\                |
|    policy-based /             \ default path   |
|           path /               \               |
|         +-----------+      +-----------+       |
|         |  Router3  |      |  Router1  |       |
|         +-----------+      +-----------+       |
|              /\                 /\             |
|              |                   |             |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
               |                   |
          +-----------------------------+
          |     Tester (Sub Network)    |
          +-----------------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Policy-based Routing (PBR) Scenario</strong>: Policy-based Routing (PBR) enables forwarding decisions based on user-defined match conditions (e.g., source prefix, DSCP, or interface) instead of the standard routing table. Such policies can create asymmetric paths that challenge the SAV mechanism if rules are derived solely from RIB or FIB information.</t>
          <t>The <strong>procedure</strong> for testing SAV under the PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure PBR on the DUT to forward traffic matching a specific source prefix (e.g., 2001:db8::/56) to Router3, while other traffic follows the default path to Router1.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate and spoofed traffic, including flows that match the PBR policy and flows that do not.</t>
            </li>
            <li>
              <t>Measure the false positive and false negative rates for both traffic types.</t>
            </li>
            <li>
              <t>Dynamically modify or remove the PBR policy and observe SAV rule adaptation.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> continue to correctly filter spoofed packets and permit legitimate packets under the PBR scenario. SAV rules <bcp14>MUST</bcp14> adapt to policy-based forwarding paths without producing misclassifications.</t>
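The PBR behavior in this test reduces to a source-prefix match that overrides the default next hop; the point of the test is that SAV classification must stay correct whichever path is taken. A small model, reusing the router names from the figure above (the function and its return values are hypothetical):

```python
import ipaddress

PBR_PREFIX = ipaddress.ip_network("2001:db8::/56")   # policy match condition

def next_hop(src):
    """Forwarding decision under the PBR policy of step 1: traffic whose
    source matches the prefix goes to Router3, all else to Router1."""
    ip = ipaddress.ip_address(src)
    if ip in PBR_PREFIX:
        return "Router3"   # policy-based path
    return "Router1"       # default path

hops = (next_hop("2001:db8::42"), next_hop("2001:db8:0:200::42"))
```

Because the policy match condition never enters the RIB or FIB, a SAV rule derived only from routing state would not anticipate the Router3 path; this is exactly the failure mode the test probes.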
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, including both protocol convergence performance and protocol message processing performance in response to route changes caused by network failures or operator configurations. Protocol convergence performance is quantified by the convergence time, defined as the duration from the onset of a routing change until the completion of the corresponding SAV rule update. Protocol message processing performance is measured by the processing throughput, represented by the total size of protocol messages processed per second.</t>
          <t>Note that the tests for control plane performance of a DUT performing intra-domain SAV are <bcp14>OPTIONAL</bcp14>. Only a DUT that implements the SAV mechanism using an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="intra-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for its control plane performance.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The convergence process of the DUT, during which SAV rules are updated, is triggered by route changes resulting from network failures or operator configurations. In <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates these route changes by adding or withdrawing prefixes to initiate the DUT's convergence procedure.</t>
          <t>The <strong>procedure</strong> for testing protocol convergence performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol convergence time of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester withdraws a specified percentage of the total prefixes supported by the DUT, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The protocol convergence time is calculated based on DUT logs that record the start and completion times of the convergence process.</t>
            </li>
          </ol>
          <t>Note that for IGPs, proportional prefix withdrawal can be achieved by selectively shutting down interfaces. For instance, if the Tester is connected to ten emulated devices through ten interfaces, each advertising a prefix, withdrawing 10% of prefixes can be accomplished by randomly disabling one interface. Similarly, 20% withdrawal corresponds to shutting down two interfaces, and so forth. This is one suggested method; other approaches that achieve the same effect are also acceptable.</t>
          <t>The protocol convergence time, defined as the duration required for the DUT to complete the convergence process, should be measured from the moment the last “hello” message is received from the emulated device on the disabled interface until SAV rule generation is finalized. To ensure accuracy, the DUT should log the timestamp of the last hello message received and the timestamp when SAV rule updates are complete. The convergence time is the difference between these two timestamps.</t>
          <t>If the emulated device sends a “goodbye hello” message during interface shutdown, it is recommended to use the receipt time of that message, rather than that of the last standard hello, as the starting point; this provides a more precise measurement, as advised in <xref target="RFC4061"/>.</t>
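Given the two DUT log timestamps, the convergence time is a simple difference. A sketch, assuming ISO 8601 timestamps in the log (the format string is hypothetical; real DUT logs vary):

```python
from datetime import datetime

def convergence_time(trigger_ts, rules_done_ts, fmt="%Y-%m-%dT%H:%M:%S.%f"):
    """Convergence time in seconds: SAV rule update completion minus the
    routing-change trigger (last hello or goodbye hello) seen by the DUT."""
    t0 = datetime.strptime(trigger_ts, fmt)
    t1 = datetime.strptime(rules_done_ts, fmt)
    delta = (t1 - t0).total_seconds()
    if 0 > delta:
        raise ValueError("rule completion precedes the trigger")
    return delta

secs = convergence_time("2025-01-01T10:00:00.000000",
                        "2025-01-01T10:00:01.250000")
```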
          <t><strong>Protocol Message Processing Performance</strong>: The test for protocol message processing performance uses the same setup illustrated in <xref target="intra-convg-perf"/>. This performance metric evaluates the protocol message processing throughput, the rate at which the DUT processes protocol messages. The Tester varies the sending rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. The DUT records both the total size of processed protocol messages and the corresponding processing time.</t>
          <t>The <strong>procedure</strong> for testing protocol message processing performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol message processing throughput of the DUT, set up the test environment as shown in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying rates, such as 10%, 20%, up to 100%, of the total link capacity between the Tester and the DUT.</t>
            </li>
            <li>
              <t>The protocol message processing throughput is calculated based on DUT logs that record the total size of processed protocol messages and the total processing time.</t>
            </li>
          </ol>
          <t>To compute the protocol message processing throughput, the DUT logs <bcp14>MUST</bcp14> include the total size of the protocol messages processed and the total time taken for processing. The throughput is then derived by dividing the total message size by the total processing time.</t>
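The derivation above is a direct division; the only subtlety is keeping the units explicit. A minimal sketch (byte counts and log field names are hypothetical):

```python
def processing_throughput(total_message_bytes, processing_seconds):
    """Protocol message processing throughput in bits per second, from the
    totals recorded in the DUT logs."""
    if not processing_seconds > 0:
        raise ValueError("processing time must be positive")
    return total_message_bytes * 8 / processing_seconds

# 125 MB of protocol messages processed in 10 s equals 100 Mbit/s.
bps = processing_throughput(125_000_000, 10.0)
```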
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, including both data plane SAV table refresh performance and data plane forwarding performance. Data plane SAV table refresh performance is quantified by the refresh rate, which indicates how quickly the DUT updates its SAV table with new SAV rules. Data plane forwarding performance is measured by the forwarding rate, defined as the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The evaluation of data plane SAV table refresh performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This metric measures the rate at which the DUT refreshes its SAV table with new SAV rules. The Tester varies the transmission rate of protocol messages, from 10% to 100% of the total link capacity between the Tester and the DUT, to influence the proportion of updated SAV rules and corresponding SAV table entries. The DUT records the total number of updated SAV table entries and the time taken to complete the refresh process.</t>
          <t>The <strong>procedure</strong> for testing data plane SAV table refresh performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane SAV table refreshing rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying percentages of the total link capacity, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The data plane SAV table refreshing rate is calculated based on DUT logs that record the total number of updated SAV table entries and the total refresh time.</t>
            </li>
          </ol>
          <t>To compute the refresh rate, the DUT logs <bcp14>MUST</bcp14> capture the total number of updated SAV table entries and the total time required for refreshing. The refresh rate is then derived by dividing the total number of updated entries by the total refresh time.</t>
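The refresh rate can likewise be extracted from a DUT log of per-entry update records. A hedged sketch, assuming each record carries a timestamp in seconds (the record layout is hypothetical):

```python
def refresh_rate(update_events):
    """SAV table refresh rate in entries per second, derived from a list
    of (timestamp_seconds, entry_id) update records logged by the DUT."""
    if len(update_events) > 1:
        times = [ts for ts, _ in update_events]
        span = max(times) - min(times)
        if not span > 0:
            raise ValueError("all updates share one timestamp")
        return len(update_events) / span
    raise ValueError("need at least two update records")

events = [(100.0, "e1"), (100.5, "e2"), (101.0, "e3"), (102.0, "e4")]
rate = refresh_rate(events)   # 4 entries over a 2-second span
```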
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester transmits a mixture of spoofed and legitimate traffic at a rate matching the total link capacity between the Tester and the DUT, while the DUT maintains a fully populated SAV table. The ratio of spoofed to legitimate traffic can be varied within a range, for example, from 1:9 to 9:1. The DUT records the total size of forwarded packets and the total duration of the forwarding process.</t>
          <t>The procedure for testing data plane forwarding performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane forwarding rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends a mix of spoofed and legitimate traffic to the DUT at the full link capacity between the Tester and the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>The data plane forwarding rate is calculated based on DUT logs that record the total size of forwarded traffic and the total forwarding time.</t>
            </li>
          </ol>
          <t>To compute the forwarding rate, the DUT logs <bcp14>MUST</bcp14> include the total size of forwarded traffic and the total time taken for forwarding. The forwarding rate is then derived by dividing the total traffic size by the total forwarding time.</t>
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT when processing legitimate and spoofed traffic across multiple inter-domain network scenarios, including SAV implementations for both customer-facing ASes and provider-/peer-facing ASes.</t>
          <t>In the following, this document presents the test scenarios for evaluating inter-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as source addresses belonging to the local AS but not announced to external networks, private network source addresses, source addresses belonging to other ASes, and unallocated (unused) source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different inter-domain SAV mechanisms may differ in their capability to block packets with forged source addresses of various origins. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
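The four categories of forged source addresses listed above can be drawn from disjoint address pools whose mixture the Tester varies. A sketch of one way to parameterize the category mix; all prefixes below are illustrative documentation values, not mandated ones:

```python
import ipaddress

# Illustrative pools for the four forged-source categories (hypothetical).
POOLS = {
    "local_unannounced": ipaddress.ip_network("2001:db8:0:f00::/56"),
    "private":           ipaddress.ip_network("fc00::/7"),
    "other_as":          ipaddress.ip_network("2001:db8:5::/48"),
    "unallocated":       ipaddress.ip_network("2001:db8:dead::/48"),
}

def category_counts(total, weights):
    """Split `total` spoofed packets across categories by weight."""
    scale = sum(weights.values())
    counts = {name: total * w // scale for name, w in weights.items()}
    # put any rounding remainder in the first category
    first = next(iter(counts))
    counts[first] += total - sum(counts.values())
    return counts

counts = category_counts(1000, {"local_unannounced": 1, "private": 1,
                                "other_as": 1, "unallocated": 1})
```

Varying the weights exercises SAV mechanisms that differ in their ability to block particular forged-source categories, as noted above.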
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case for SAV in customer-facing ASes under an inter-domain symmetric routing scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network environment, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which in turn is a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, respectively. AS 2 then propagates routes for P1 and P6 to the DUT, enabling the DUT to learn these prefixes from both AS 1 and AS 2. In this test, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for customer-facing ASes under symmetric inter-domain routing, construct the test environment as shown in <xref target="inter-customer-syn"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-syn"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-lpp"/> presents a test case for SAV in customer-facing ASes under an inter-domain asymmetric routing scenario induced by NO_EXPORT community configuration. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 with the NO_EXPORT community attribute, preventing AS 2 from propagating the route for P1 to the DUT. Similarly, AS 1 advertises prefix P6 to the DUT with the NO_EXPORT attribute, preventing the DUT from propagating this route to AS 3. As a result, the DUT learns the route for prefix P1 only from AS 1. The legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under NO_EXPORT-induced asymmetric routing, construct the test environment as shown in <xref target="inter-customer-lpp"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-lpp"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
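The NO_EXPORT case illustrates why rules derived solely from local RIB/FIB state misclassify: the DUT's FIB maps P1 only to the AS 1 interface, yet legitimate P1-sourced traffic also arrives from AS 2's direction. A toy model of the two rule-derivation strategies (interface names and the rule format are hypothetical; the strict variant corresponds to uRPF-style filtering):

```python
# FIB-derived rule: accept a source prefix only on the interface that the
# local FIB uses to reach it (strict uRPF-style).
FIB = {"P1": "if_as1"}                      # DUT learns P1 only from AS 1

# Rules that also account for customer-cone propagation: P1-sourced
# traffic may legitimately enter from AS 2 as well.
CONE_RULES = {"P1": {"if_as1", "if_as2"}}

def accept_strict(src_prefix, in_if):
    return FIB.get(src_prefix) == in_if

def accept_cone(src_prefix, in_if):
    return in_if in CONE_RULES.get(src_prefix, set())

# Legitimate P1 packet arriving from AS 2's direction:
strict_verdict = accept_strict("P1", "if_as2")   # wrongly blocked
cone_verdict = accept_cone("P1", "if_as2")       # correctly permitted
```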
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+  AS 3(P3, P7)  |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                        P7[AS 3] /           \ P7[AS 3]          |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P3[AS 4, AS 3] /     |      \           \              |
|         P7[AS 4, AS 3] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P6, P7)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+
P7 is the anycast prefix and is advertised only by AS 3 via BGP.
Note that, unlike the other AS network figures in this document,
this figure illustrates that AS 3 advertises prefixes P3 and P7 
to AS 2 through AS 4, and does not depict the propagation of 
prefixes P1 and P6 beyond AS 2.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-dsr"/> presents a test case for SAV in customer-facing ASes under a Direct Server Return (DSR) scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to an anycast destination IP in P7, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. Anycast servers in AS 3 receive the requests and tunnel them to edge servers in AS 1. The edge servers then return content to the users with source addresses in prefix P7. If the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2, the Tester sends traffic with source addresses in P7 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2. Alternatively, if the reverse forwarding path is AS 1-&gt;AS 2, the Tester sends traffic with source addresses in P7 and destination addresses in P2 along the path AS 1-&gt;AS 2. In this case, AS 2 may serve as the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this DSR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under DSR conditions, construct the test environment as shown in <xref target="inter-customer-dsr"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the DSR scenario.</t>
            </li>
            <li>
              <t>The Tester sends legitimate traffic (with source addresses in P7 and destination addresses in P2) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT permits legitimate traffic with source addresses in P7 received from the direction of AS 1.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-dsr"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to permit the legitimate return traffic with source addresses in P7 without false positives.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes prefix P1 as spoofed by the attacker, which resides
inside AS 2 or is connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-reflect"/> illustrates a test case for SAV in customer-facing ASes under a reflection attack scenario. In this scenario, a reflection attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P5) that are configured to respond to such requests. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-reflect"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a reflection attack scenario, construct the test environment as shown in <xref target="inter-customer-reflect"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P5) toward AS 5 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-reflect"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
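          <t>The false positive and false negative rates referenced throughout these test cases can be computed from per-packet outcomes recorded by the Tester. The following non-normative sketch assumes the Tester labels each transmitted packet and observes the DUT's verdict; the label and action strings are illustrative, not normative.</t>
          <sourcecode type="python"><![CDATA[
```python
def sav_error_rates(results):
    """Compute (false_positive_rate, false_negative_rate).

    results: iterable of (label, action) pairs, where label is
    "legitimate" or "spoofed" and action is the DUT's observed
    verdict, "permitted" or "blocked".  A false positive is a
    legitimate packet blocked by SAV; a false negative is a
    spoofed packet permitted by SAV.
    """
    legit = spoofed = fp = fn = 0
    for label, action in results:
        if label == "legitimate":
            legit += 1
            fp += action == "blocked"    # legitimate but dropped
        else:
            spoofed += 1
            fn += action == "permitted"  # spoofed but forwarded
    fp_rate = fp / legit if legit else 0.0
    fn_rate = fn / spoofed if spoofed else 0.0
    return fp_rate, fn_rate
```
]]></sourcecode>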
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' denotes prefix P5 as spoofed by the attacker, which resides
inside AS 2 or is connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-direct"/> presents a test case for SAV in customer-facing ASes under a direct attack scenario. In this scenario, a direct attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), aiming to overwhelm its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-direct"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a direct attack scenario, construct the test environment as shown in <xref target="inter-customer-direct"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P5 and destination addresses in P1) toward AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P5 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-direct"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes prefix P1 as spoofed by the attacker, which resides
inside AS 3 or is connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> illustrates a test case for SAV in provider/peer-facing ASes under a reflection attack scenario. In this scenario, the attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P2) that are configured to respond. The Tester emulates the attacker by performing source address spoofing. The servers then send overwhelming responses to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider/peer-facing ASes in a reflection attack scenario, construct the test environment as shown in <xref target="reflection-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P2) toward AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="reflection-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes prefix P2 as spoofed by the attacker, which resides
inside AS 3 or is connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="direct-attack-p"/> presents a test case for SAV in provider/peer-facing ASes under a direct attack scenario. In this scenario, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="direct-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider-facing ASes in a direct attack scenario, construct the test environment as shown in <xref target="direct-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P2 and destination addresses in P1) toward AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P2 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="direct-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-domain-frr-topo">
            <name>Inter-domain SAV under FRR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment           |
|          +-----------+            +-----------+      |
|          |   AS3     |------------|   AS2     |      |
|          +-----------+            +-----------+      |
|               /\                       /\            |
|               |                        |             |
| primary link  |            backup link |             |
|               | (C2P)                  | (C2P)       |
|        +-----------------------------------------+   |
|        |                   DUT                   |   |
|        +-----------------------------------------+   |
|                           /\                         |
|                           |                          |
|                           | Legitimate and           |
|                           | Spoofed Traffic          |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            | (C2P)
                     +-------------+
                     |    Tester   |
                     +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under FRR Scenario</strong>: Inter-domain Fast Reroute (FRR) mechanisms, such as BGP Prefix Independent Convergence (PIC) or MPLS-based FRR, allow rapid failover between ASes after a link or node failure. These events may temporarily desynchronize routing information and SAV rules.</t>
          <t>The <strong>procedure</strong> for testing SAV under FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure FRR or BGP PIC on the DUT for the inter-AS links to AS 3 (primary) and AS 2 (backup).</t>
            </li>
            <li>
              <t>Continuously send legitimate and spoofed traffic from the Tester toward the DUT.</t>
            </li>
            <li>
              <t>Trigger a failure on the primary link between the DUT and AS 3 to activate the FRR backup path via AS 2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during and after switchover.</t>
            </li>
            <li>
              <t>Restore the AS 3 link and verify SAV table consistency.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>MUST</bcp14> maintain consistent SAV filtering during FRR events. Transient topology changes <bcp14>SHOULD NOT</bcp14> lead to acceptance of spoofed traffic or unnecessary blocking of legitimate packets.</t>
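          <t>The measurement in step 4 can be organized around the failover timeline by bucketing per-packet outcomes into before, during, and after windows. The sketch below is non-normative; the timestamps and the failure and convergence markers are assumed to be recorded by the test harness.</t>
          <sourcecode type="python"><![CDATA[
```python
def rates_by_window(samples, t_fail, t_conv):
    """Bucket per-packet outcomes around an FRR event.

    samples: iterable of (timestamp, label, action); t_fail marks
    when the primary-link failure is triggered and t_conv when the
    backup path is confirmed active.  Labels and actions follow the
    SAV test convention: "legitimate"/"spoofed" and
    "permitted"/"blocked".  Returns, per window, the pair
    (false positive rate, false negative rate).
    """
    windows = {"before": [], "during": [], "after": []}
    for ts, label, action in samples:
        key = ("before" if ts < t_fail
               else "during" if ts < t_conv else "after")
        windows[key].append((label, action))
    rates = {}
    for key, res in windows.items():
        legit = [a for l, a in res if l == "legitimate"]
        spoof = [a for l, a in res if l == "spoofed"]
        rates[key] = (
            legit.count("blocked") / len(legit) if legit else 0.0,
            spoof.count("permitted") / len(spoof) if spoof else 0.0,
        )
    return rates
```
]]></sourcecode>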
          <figure anchor="inter-domain-pbr-topo">
            <name>Inter-domain SAV under PBR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|               Test Network Environment           |
|     +-----------+            +-----------+       |
|     |   AS3     |------------|   AS2     |       |
|     +-----------+            +-----------+       |
|          /\                       /\             |
|           |                        |             |
|           | preferred path         | default path|
|           | (C2P)                  | (C2P)       |
|    +-----------------------------------------+   |
|    |                  DUT                    |   |
|    +-----------------------------------------+   |
|                        /\                        |
|                         | Legitimate and         |
|                         | Spoofed Traffic        |
|                         |                        |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                          | (C2P) 
                   +-------------+
                   |    Tester   |
                   +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under PBR Scenario</strong>: In inter-domain environments, routing policies such as local preference, route maps, or communities may alter path selection independently of shortest-path routing. Such policy-driven forwarding can affect how the SAV rules are derived and applied.</t>
          <t>The <strong>procedure</strong> for testing SAV under PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure a routing policy on the DUT (e.g., set local preference) to prefer AS 3 for specific prefixes while maintaining AS 2 as an alternative path.</t>
            </li>
            <li>
              <t>Generate legitimate and spoofed traffic from the Tester matching both policy-affected and unaffected prefixes.</t>
            </li>
            <li>
              <t>Observe SAV filtering behavior before and after policy changes.</t>
            </li>
            <li>
              <t>Modify the routing policy dynamically and measure false positive and false negative rates.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> maintain correct SAV filtering regardless of routing policy changes. Legitimate traffic rerouted by policy <bcp14>MUST NOT</bcp14> be dropped, and spoofed traffic <bcp14>MUST NOT</bcp14> be forwarded during or after policy updates.</t>
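          <t>The observation in step 3 can be supported by diffing the DUT's SAV table before and after the policy change and correlating the differences with the intended policy update. The sketch below is non-normative; the interface names and prefix strings are placeholders, and how the SAV table is exported from the DUT is device-specific.</t>
          <sourcecode type="python"><![CDATA[
```python
def sav_table_diff(before, after):
    """Report SAV rule changes per interface after a policy update.

    before/after: dicts mapping an interface name to the set of
    source prefixes the DUT permits on that interface.  Returns,
    per interface, the prefixes newly permitted ("added") and the
    prefixes withdrawn ("removed"), which the tester can check
    against the intended routing-policy change.
    """
    ifaces = set(before) | set(after)
    return {
        i: {
            "added": sorted(after.get(i, set()) - before.get(i, set())),
            "removed": sorted(before.get(i, set()) - after.get(i, set())),
        }
        for i in ifaces
    }
```
]]></sourcecode>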
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedures, and metrics for evaluating protocol convergence performance and protocol message processing performance are the same as those described in <xref target="intra-control-plane-sec"/>. Note that control plane performance tests for a DUT performing inter-domain SAV are <bcp14>OPTIONAL</bcp14>. Only a DUT that implements the SAV mechanism using an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="inter-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for control plane performance.</t>
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedures, and metrics for evaluating data plane SAV table refresh performance and data plane forwarding performance are the same as those described in <xref target="intra-data-plane-sec"/>.</t>
        </section>
      </section>
      <section anchor="resource-utilization-1">
        <name>Resource Utilization</name>
        <t>When evaluating the DUT for both intra-domain (<xref target="intra_domain_sav"/>) and inter-domain (<xref target="inter_domain_sav"/>) SAV functionality, CPU and memory utilization <bcp14>MUST</bcp14> be recorded for both the control and data planes. These metrics <bcp14>SHOULD</bcp14> be recorded continuously and collected separately per plane to facilitate granular performance analysis.</t>
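        <t>Continuous per-plane recording can be sketched as a simple sampling loop. In the non-normative sketch below, read_metric is an assumed, device-specific hook (for example, an SNMP or CLI wrapper), not a standard API; the clock and sleep parameters are injectable only to make the loop testable.</t>
        <sourcecode type="python"><![CDATA[
```python
import time

def sample_utilization(read_metric, planes=("control", "data"),
                       interval=1.0, duration=5.0,
                       clock=time.monotonic, sleep=time.sleep):
    """Periodically record CPU and memory utilization per plane.

    read_metric(plane, metric) is a device-specific callback
    returning a numeric sample for "cpu" or "mem" on the given
    plane.  Returns one record per plane per sampling interval.
    """
    records = []
    end = clock() + duration
    while clock() < end:
        ts = clock()
        for plane in planes:
            records.append({
                "time": ts, "plane": plane,
                "cpu": read_metric(plane, "cpu"),
                "mem": read_metric(plane, "mem"),
            })
        sleep(interval)
    return records
```
]]></sourcecode>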
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test follows a reporting format comprising both global, standardized components and elements specific to the individual test. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be documented in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2026"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2026"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2026"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2026"/>
          </front>
        </reference>
      </references>
    </references>

    <section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, Shengnan Yue, Changwang Lin, Yuanyuan Zhang, Xueyan Song, Yangfei Guo, Shenglin Jiang, Tian Tong, Meng Li, Ron Bonica, and Mohamed Boucadair for their valuable comments on and reviews of this document.
Apologies to anyone whose name the authors may have missed.</t>
    </section>
  </back>

</rfc>
