DLAD PGI PART 13 – SIMPLIFIED ACQUISITION PROCEDURES
(Revised September 21, 2012 through PROCLTR 2012-50)
PGI 13.003-90(e) Exclusions from automated procurement.
(i) Only those acquisitions that meet the following criteria are authorized to be excluded from automated procurement processes:
(A) Total dollar value exceeds the simplified acquisition threshold;
(B) Place of inspection code (PIC) equals 3 or 4;
(C) Essential national stock number (NSN) or material data is missing, to include
(1) no approved sources on part number NSNs/materials;
(2) acquisition method suffix code (AMSC) B, G, or T without basic document;
(3) no procurement history when the NSN or material was previously purchased;
(4) no PIC; or
(5) no packaging data.
(D) Government-furnished material/property/tooling applies;
(E) First article testing applies;
(F) Items are flight safety critical;
(G) Items are life support equipment;
(H) Export control restrictions apply;
(I) Special data licensing agreements apply; or
(J) Document type equals one of the following:
(1) ZRO – Rush purchase order;
(2) ZTAV – Order fulfillment;
(3) ZSFC – STORES/FFAVORS customer direct;
(4) ZSFD – STORES/FFAVORS DLA direct;
(5) ZN – Order fulfillment TAV processing (request for quote (RFQ));
(6) ZP – Order fulfillment procurement offset or RFQ;
(7) AN – Standard request for quote (RFQ); or
(8) Request for GP bid or RFQ.
(ii) Process for authorizing additional exclusions.
(A) Approval level. The Acquisition Executive must approve any exclusions other than those identified in PGI 13.003-90(e)(i). Approval of the exceptions must be in writing, and a copy of the exception to policy shall be kept in all contract files to which it applies. This authority is not delegable.
(1) Exclusion determinations shall include the following information:
(i) Point of contact;
(ii) Exclusion criteria: National stock number (NSN)/ federal supply class (FSC)/quality control code (QCC) or purchase request (PR);
(iii) Level of automation exclusion: Solicitation, evaluation, and/or award;
(iv) Exclusion code – See Appendix A, Automated Procurement Exclusion Codes;
(v) Duration of exclusion: temporary or permanent; and
(vi) Justification for exclusion.
(2) Implementation of approved exclusions. The Acquisition Executive shall forward copies of the determination and justification to the Deputy Director, DLA Acquisition, J7, to the procurement systems process owner, J71, and to the automated solicitation/evaluation/award sub-process owner at DLA Land and Maritime for entry of the following data into the automated procurement exclusion tables:
(i) Federal supply class (FSC);
(ii) NSN;
(iii) Weapon system designator code;
(iv) Project code;
(v) Foreign military sales country code;
(vi) Special procedure code;
(vii) PR number;
(viii) Acquisition method suffix code (AMSC) and FSC combination;
(ix) Quality control code;
(x) Priority and project (all priority 01 and priority 02 and 03 with 999 required delivery dates are excluded agency-wide);
(xi) Advice code (2T or 2F are excluded agency-wide);
(xii) Precious metal indicator code – other than A, U, or null; and
(xiii) Document type.
(3) Reporting. Each supply chain shall submit a quarterly report providing status information and outcomes associated with their automated exclusions list as part of DLA Acquisition’s review and analysis (R&A) metrics review.
(Revised March 20, 2013 through PROCLTR 2013-39)
PGI 13.106-2 Evaluation of quotations or offers.
PGI 13.106-2(b)(S-90)(3)(ii)(D)(1) Past performance evaluation and past performance information systems.
(a) Scope. This subsection prescribes the mandatory procedures, guidance, and instructions for using past performance information systems in evaluating contractor past performance as an evaluation factor for simplified acquisitions.
(b) Definitions.
(1) “Score(s)” as used in this subsection refer to ABVS assessments of a contractor’s delivery and quality performance on DLA contracts.
(2) “Classification(s)” as used in this subsection refer to the PPIRS-SR assessment of a contractor’s delivery and quality performance on past DoD contracts, including DLA contracts.
(i) Classifications are comprised of a delivery score and a quality color ranking.
(c) General past performance information.
(1) Contracting officers are advised not to rely solely on ABVS, PPIRS-SR, PPIRS-RC, or other performance scores/classifications, and should consider reviewing the data used to construct the past performance score if the circumstances of the procurement dictate (e.g., a significant price differential or close past performance assessments).
(2) Past performance information used in source selection is confidential source selection information and as such, is protected from release under the procurement integrity rules (see FAR 3.104-4 and 3.104-5). The information is available only to the business entity to which it applies. The past performance information used in the source selection process must carry a restrictive legend substantially the same as the following: “Source Selection Information – see FAR 2.101 and 3.104.” This legend must appear on all hard-copy printouts. Release of past performance information to any non-DLA Governmental entities must have the concurrence of the local counsel. Release to any private entities shall be strictly limited, have the concurrence of the local counsel, and be in accordance with Freedom of Information Act (FOIA, 5 U.S.C. 552) guidelines (see FAR Subpart 24.2, Freedom of Information Act, and DFARS 224.2, Freedom of Information Act). Any FOIA decision to release performance data to other contractors will be made on a case-by-case basis.
(d) Automated best value system (ABVS)
(1) ABVS is a DLA computerized system that collects a contractor’s existing past performance data and translates it into a numeric score. The contracting officer may use the score in evaluating past performance when conducting a comparative evaluation of offers. DLA is migrating from the use of ABVS to the PPIRS-SR.
(2) ABVS Scores:
(i) Contractors receive DLA-assigned ABVS scores for their past performance in each Federal Supply Class (FSC scores). The FSC scores are based on DLA consolidated performance history. A contractor may have multiple FSC scores but will have only one DLA score, which is a compilation of the contractor’s FSC scores for all business conducted with DLA.
(ii) The ABVS score is a combination of a vendor's delivery and quality scores, and the scores range from zero to a perfect score of 100. If a vendor's score is less than 100, DLA provides the contractor the negative data upon which its score is based. These scores are displayed on the “choose awardee” screen in the DLA preaward contracting system (DPACS) for all entered quotes/offers. Higher numerical scores represent high past performance assessments and lower associated risk. Lower numerical scores indicate low past performance assessments and higher risk.
(iii) Scores are calculated daily based upon two years of data.
(iv) Delivery scores are comprised of and calculated as follows:
- Delivery delinquencies
- Number
- Severity
- Contractor caused terminations, cancellations, and withdrawals
Formulas for Delivery Performance

Formula: DS = (OW*OS) + (AW*AS)
Legend: DS = Delivery score; OW = On-time weight; OS = On-time score; AW = Average days late weight; AS = Average days late score

Formula: OS = 100*O/L
Legend: OS = On-time score; O = Number of lines shipped on time during rating period; L = Number of lines shipped during rating period

Formula: AS = greater of (100 - (D/L)) or 0
Legend: AS = Average days late score (AS range is 0 to 100); D = Total days late during rating period; L = Number of lines shipped during rating period

Delivery scores are derived from two sub-factors: percent on time and average days late. The relative weights of those factors are set at 0.6 and 0.4, respectively.
For administrative purposes, the delivery rating period excludes the most recent 60 days. For ABVS purposes, delinquent lines represent shipments not shipped and/or received in their entirety by the Contract Delivery Date (CDD). Contractor caused delivery extensions, regardless of consideration paid, will be reflected in the delivery score.
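For illustration only, the delivery-score formulas above can be sketched in Python. The function and variable names are ours, not part of ABVS; the 999.9 no-data convention follows the usage described elsewhere in this subsection.

```python
def abvs_delivery_score(lines_shipped, lines_on_time, total_days_late,
                        on_time_weight=0.6, avg_late_weight=0.4):
    """Sketch of DS = (OW*OS) + (AW*AS), where OS = 100*O/L and
    AS = greater of (100 - (D/L)) or 0."""
    if lines_shipped == 0:
        return 999.9  # no data in the rating period (ABVS no-history code)
    on_time_score = 100 * lines_on_time / lines_shipped             # OS = 100*O/L
    avg_late_score = max(100 - total_days_late / lines_shipped, 0)  # AS, floored at 0
    return on_time_weight * on_time_score + avg_late_weight * avg_late_score
```

For example, a contractor that shipped 10 lines, 9 of them on time, with 20 total days late, would score 0.6*90 + 0.4*98 = 93.2.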
(v) Quality scores are comprised of and calculated as follows:
- Quality complaints
- Product nonconformances
- Packaging nonconformances
Formulas for Quality Performance

Formula: QS = (PRW*PRS) + (PAW*PAS)
Legend: QS = Quality score; PRW = Product weight; PRS = Product score; PAW = Packaging weight; PAS = Packaging score

Formula: PRS = 100*(1-(PRC/L))
Legend: PRS = Product score; PRC = Number of product complaints during rating period; L = Number of lines shipped during rating period

Formula: PAS = 100*(1-(PAC/L))
Legend: PAS = Packaging score; PAC = Number of packaging complaints during rating period; L = Number of lines shipped during rating period

Quality scores are derived from two sub-factors: product complaints and packaging complaints. The relative weights of those factors are set at 0.8 and 0.2, respectively. Contractors having no data in the rating period are assigned scores of 999.9.
For administrative purposes, the quality rating period excludes the most recent 30 days. Repair, replacement, or reimbursement of quality and packaging defects will not provide relief of negative ABVS data.
The above 60 and 30 day offset periods are not grace periods. Contractor caused discrepancies or delinquencies will be reflected in the ABVS as an indicator of past performance.
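Similarly, a minimal sketch of the quality-score formulas above (names are illustrative, not part of ABVS):

```python
def abvs_quality_score(lines_shipped, product_complaints, packaging_complaints,
                       product_weight=0.8, packaging_weight=0.2):
    """Sketch of QS = (PRW*PRS) + (PAW*PAS)."""
    if lines_shipped == 0:
        return 999.9  # no data in the rating period
    product_score = 100 * (1 - product_complaints / lines_shipped)      # PRS
    packaging_score = 100 * (1 - packaging_complaints / lines_shipped)  # PAS
    return product_weight * product_score + packaging_weight * packaging_score
```

For example, one product complaint and no packaging complaints across 10 lines yields 0.8*90 + 0.2*100 = 92.0.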
(3) Data challenges.
(i) If a contractor’s ABVS FSC score is being challenged, indicated by a “C” on the “choose awardee” screen, contracting officers may wish to consult the ABVS site administrator to assess the magnitude of the challenge and its impact on the contractor’s score.
(ii) DLA will make negative quality and delivery data reflected in the ABVS score available to contractors daily via the ABVS Website. The contractor’s negative performance data will be posted before it is reflected in the ABVS score (preview period), to give contractors an opportunity to review and verify data. A contractor must challenge any negative data within the preview period to assure corrections are posted before calculation of the score. Contractors must submit challenges and substantiating evidence (e.g. invoices, DD Form 250s, modifications) to the ABVS administrator. The "center" field will identify the appropriate focal point.
(A) For those identified as “DLA Aviation" send challenges to:
DLA Aviation
Attention: DLA Aviation-BPSC (ABVS)
8000 Jefferson-Davis Highway
Richmond, Virginia 23297-5516
Telephone Number: (804) 279-5375
Facsimile (fax) number: (804) 279-5042
Email: dscrabvs@dla.mil
(B) For those identified as “DLA Land and Maritime,” send challenges to:
DLA Land and Maritime
Attention: DLA Land and Maritime -BPSF (ABVS)
Post office box 3990
Columbus, Ohio 43218-3990
Telephone numbers: (614) 692-3383
Facsimile number: (614) 692-4170
Email: dsccabvs@dla.mil
(C) For those identified as “DLA Troop Support,” send challenges to:
DLA Troop Support
Attention: DLA Troop Support-BPSA (ABVS)
700 Robbins Avenue
Philadelphia, Pennsylvania 19111-5096
Telephone number: (215) 737-7844
Facsimile number: (215) 737-7949
Email: dscpabvs@dla.mil
(D) The ABVS administrator shall make every effort to resolve data challenges within ten working days. If the contractor and the ABVS administrator cannot arrive at a mutual agreement on challenged data, it becomes disputed data. Disputes which cannot be resolved will be elevated. Authority for resolution of disputed data is one level above the contracting officer. Award decisions resulting from reliance on disputed data must also be approved one level above the contracting officer.
(E) For further details concerning ABVS score calculations and contractor data challenge procedures, refer to the ABVS website at http://www.aviation.dla.mil/UserWeb/proc/ABVM/Abvm.htm.
(e) Past performance information retrieval system – statistical reporting (PPIRS-SR).
(1) PPIRS-SR is a web-enabled, government-wide application that collects quantifiable delivery and quality contractor past performance information from Federal contracting activities.
(2) PPIRS-SR classifications:
(i) Delivery is represented numerically on a 100-point scale, 100 being perfect. Quality assessments, however, are based upon a color-coded, percentile ranking of comparative scores among all contractors with award information for the subject FSC.
(ii) High numbers represent high on-time delivery performance. Lower numbers equate to low on-time deliveries.
(iii) Quality assessments are ranked only for contractors in which inspection records are present for the subject FSC. This is a significant departure from the quality methodology employed by DLA through ABVS. In ABVS, contractors with award history are presumed to demonstrate satisfactory quality, unless discrepant records are received. Conversely, PPIRS-SR only assesses quality for awards in which Government inspection records are required and received. If inspection is not required, the award is not counted in the contractor’s quality assessment.
(iv) Contractors with delivery records but no quality records for an FSC are ranked in the “green” color ranking. Additionally, when no quality records exist for a contractor within an FSC, an asterisk (*) will be displayed in the “quality score” column on the “solicitation inquiry report”.
(v) Because there are occasions when contractors having no quality records may ultimately demonstrate better quality and less performance risk than a contractor in a higher percentile group (“dark blue” and “purple”), absence of quality records does not preclude award to a contractor.
(vi) For information concerning PPIRS-SR classification calculations and contractor data challenge procedures, refer to the PPIRS-SR procedural guide for application development at: http://www.ppirs.gov/ppirsfiles/pdf/procedural102004.pdf.
(3) Access to PPIRS-SR classifications is accomplished through the PPIRS-SR website: http://www.ppirs.gov/. Contractor classifications may be reviewed and analyzed utilizing the solicitation inquiry report.
(i) PPIRS-SR classifications are established on a Federal Supply Classification (FSC) basis.
(ii) Classifications are calculated monthly based upon three years of data.
(iii) Delivery performance is based on the total number of contract line items received and the percent of contract line items with on-time deliveries. Late deliveries are assessed an added weight based upon days late, as shown in the table below.

Formula for Delivery Performance

Delivery score = (1 - (Total weight for late deliveries / Total line item number)) x 100

Days late: Late delivery weight:
6-30 days late: 1
31-60 days late: 1.5
61-90 days late: 2
More than 90 days late: 2.5
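The weighted delivery formula can be sketched as follows. This is a minimal illustration; the zero weight for deliveries fewer than 6 days late is our assumption, since the weight table begins at 6 days.

```python
def late_delivery_weight(days_late):
    """Weight per the PPIRS-SR table; <= 5 days late is assumed un-penalized."""
    if days_late <= 5:
        return 0.0   # assumption: the table starts at 6 days late
    if days_late <= 30:
        return 1.0
    if days_late <= 60:
        return 1.5
    if days_late <= 90:
        return 2.0
    return 2.5

def ppirs_sr_delivery_score(days_late_per_line):
    """(1 - (total weight for late deliveries / total line item number)) x 100."""
    total_lines = len(days_late_per_line)
    total_late_weight = sum(late_delivery_weight(d) for d in days_late_per_line)
    return (1 - total_late_weight / total_lines) * 100
```

For example, four line items delivered 0, 0, 10, and 45 days late carry a total late weight of 2.5, giving (1 - 2.5/4) x 100 = 37.5.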
(iv) Quality performance formula follows:
(Positive weighted data minus negative weighted data) / Contract FSC line item total
Contractor quality performance is based on a comparison among all contractors within an FSC. Contractors will be grouped by color, representing their ranking within the FSC.
Color rankings are shown in the table below:

Color       Percent group
Dark blue   High five percent
Purple      Next 10 percent
Green       Next 70 percent
Yellow      Next 10 percent
Red         Last five percent

Note: If there is only one percentage group for an entire FSC, the group will be classified as green.
Quality performance records to be used and the weight factors for each:

Record | Service | Positive weight | Negative weight
Bulletins | Navy | N/A | -1.0 (red); -0.7 (yellow)
DCMA CAR records (Level III and IV corrective actions - formerly method C/D) | DCMA | N/A | -1.0 (Level 4); -0.7 (Level 3)
DLA quality records: depot new contract def (doc type 9); direct vendor delivery def (doc type 6); medical (doc types B, C, and D) | DLA | N/A | -0.4; -0.4; -1.0 (respectively)
GIDEP alerts | All | N/A | -1.0 (critical); -0.7 (major); -0.2 (minor)
* Lab tests (doc type 4) | DLA | +0.2/+1 | -1.0 (critical); -0.7 (major); -0.1 (minor)
Material inspection records (MIRs) | Navy | +1 | -1.0 (critical); -0.7 (major); -0.2 (minor)
PQDRs - category 1 (DLA doc type 0) | All | N/A | -1.0 (Cat 1 or doc type 0); -0.7 (Cat 2 or doc type 1); -0.2 (info)
Surveys (excluding pre-award surveys) | DCMA and Navy | +0.7 | -0.7 (others)
Test reports (1st article, production, etc.) | Navy | +0.5 | -0.5
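As an illustration of the quality formula and color grouping above, the following sketch computes a weighted quality score and maps a contractor's percentile standing within an FSC to a color group. The names and the treatment of percentile boundaries are ours, not part of PPIRS-SR.

```python
def ppirs_sr_quality_score(positive_weighted, negative_weighted, fsc_line_total):
    """(Positive weighted data minus negative weighted data) / FSC line item total."""
    return (positive_weighted - negative_weighted) / fsc_line_total

def quality_color(percentile):
    """Map a percentile standing (0.0 low to 1.0 high) within an FSC to a color."""
    if percentile > 0.95:
        return "dark blue"  # high five percent
    if percentile > 0.85:
        return "purple"     # next 10 percent
    if percentile > 0.15:
        return "green"      # next 70 percent
    if percentile > 0.05:
        return "yellow"     # next 10 percent
    return "red"            # last five percent
```

For example, +2.0 in positive weighted data and -0.7 in negative weighted data across 10 FSC line items yields a score of 0.13; a contractor ranked at the 50th percentile within the FSC falls in the green group.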
(f) Evaluation using ABVS scores and PPIRS-SR classifications.
(1) The following procedures provide information for using and assessing the information in ABVS and PPIRS-SR. The contracting officer will first evaluate contractors using their ABVS scores for the solicited FSC in effect at the time of evaluation. The contracting officer will use a contractor’s DLA score to evaluate a contractor without an FSC score for that particular FSC. The contracting officer may consider the volume of business on which the FSC score is based as a measure of confidence in the score’s indication of performance risk. The contracting officer may choose to use the DLA score if the volume of business would tend to make the FSC-specific score an inadequate indicator of performance risk. The contracting officer also may use the DLA score if the FSC scores among contractors are relatively equal. For non-NSN items, the contracting officer will evaluate using a contractor’s DLA score in effect at the time of evaluation. Contractors with no performance history will not be evaluated favorably or unfavorably and will be assigned a “999.9” in ABVS. A “999.9” is used to designate those instances wherein the contractor has no past performance history, has no history for the particular FSC, or has no history for the timeframe being rated.
(2) In order for the Government to assess performance risk, if the quoter/offeror having the lowest evaluated price also has an ABVS FSC score below 70 and would potentially be bypassed under best value in favor of a higher priced quoter/offeror with a higher ABVS FSC score, then past performance evaluation will be accomplished using PPIRS-SR information, in lieu of ABVS, for all quotes/offers received.
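The decision rule in paragraph (2) can be sketched as follows (an illustration under our own naming; quotes are represented as (price, ABVS FSC score) pairs):

```python
def switch_to_ppirs_sr(quotes):
    """Return True when PPIRS-SR should replace ABVS for the evaluation:
    the lowest-priced quote scores below 70 and could be bypassed in favor
    of a higher-priced quote with a higher ABVS FSC score."""
    low_price, low_score = min(quotes, key=lambda q: q[0])
    could_be_bypassed = any(price > low_price and score > low_score
                            for price, score in quotes)
    return low_score < 70 and could_be_bypassed
```

For example, a low quote of $10.00 with a score of 60 competing against an $11.00 quote scoring 95 triggers the switch; if the low quote scored 80, it would not.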
(3) Evaluation of PPIRS-SR delivery assessments will be based upon a contractor’s score on a numerical scale ranging from 1 (low) to 100 (high). High numbers represent high on-time delivery performance. Lower numbers equate to lower on-time deliveries. A ‘0’ (zero) score with ‘0’ (zero) lines is used to designate those instances where the contractor has no history for the particular FSC being rated.
(4) In PPIRS-SR, contractor quality will be assessed based upon color/percentile groups. Contractor quality assessments will be evaluated as follows:
Dark blue assessments will be evaluated more favorably than purple assessments;
Purple assessments will be evaluated more favorably than green assessments;
Green assessments will be evaluated more favorably than yellow assessments; and,
Yellow assessments will be evaluated more favorably than red assessments.
(5) Contractors with delivery records but no quality records for an FSC are ranked in the “green” color ranking. Additionally, when no quality records exist for a contractor within an FSC, an asterisk (*) will be displayed instead of an actual quality score. Contractors with delivery records but without quality records will be evaluated as having no negative quality records within the PPIRS-SR.
(6) In the case of a contractor without a record of relevant past performance or for whom information on past performance is not available in the PPIRS-SR, the contractor will be evaluated neither favorably nor unfavorably on past performance.
(h) Award justification.
(1) Contract files must be documented with the rationale supporting all award decisions. The award decision must demonstrate how paying more than low price reduces performance risk or otherwise benefits the government. The award justification must be commensurate with the price difference between the awardee and the low quote/offer, i.e., the greater the difference in price, the stronger the award justification must be.
(2) For ABVS and PPIRS-SR awards, justification templates are available in DPACS to assist in file documentation. Choose the award justification template that most closely represents the particulars of the current award and enter requested information. These forms may be supplemented with additional information, as necessary.
PGI 13.106-3(b) File documentation and retention.
Simplified Acquisition Pricing Memorandum
FAR 13.106-3(b), DLAD 13.106-3(b)(S-90)(1)(i)

Purchase request or call/order number: _______________
Price reasonableness code (PRC): _________
Award number: _______________________
National stock number (NSN): _________________________
Award date: ________________
Item/service: _______________________
Quantity: ___________
A. Price Reasonableness (Check one or more blocks)

[ ] Adequate price competition (FAR 13.106-3(a) and/or 15.404-1(b)(2)(i)).
[ ] Best value evaluation (if awarding to other than the low offer, explain on attachment) (DLAD Appendix C, section 3.8.2.1).
[ ] Federal Supply Schedule (FSS) combined with non-FSS (FAR 8.402(f)). Explain the basis for finding the non-FSS item(s) fairly and reasonably priced: ________________________________________________________________________
[ ] Comparison with prior purchase price(s) of [ ] same or [ ] similar item (FAR 13.106-3(a)(2)(ii) and (iv) and DLAD 13.106-3(b)(S-90)(1)(iii)).
    Item name and number: __________________________________________________
    Contract/purchase order: _________________________________________________
    Vendor: ________________________________________________________________
    Unit of issue: _________ Unit price: _____________ Quantity: ___________
    Award date: _______________
[ ] Commercial catalog/published price list for [ ] same or [ ] similar item (FAR 12.209 and 13.106-3(a)(2)(iii) and DLAD 13.106-3(b)(S-90)(1)(iii)).
    Catalog/price list name/number: ___________________________________________
    Date: _______________________ Page (if applicable): _______________________
[ ] Established market price (FAR 12.209): ________________________________
    Means of verification: ___________________________________________________
[ ] Price(s) set by law or regulation (FAR 15.403-1(c)(2)).
[ ] Order placed against GSA Schedule (FAR 8.404(d)) (attach screen shot from GSA e-Buy).
[ ] Order for services with SOW placed against GSA Schedule; evaluated level of effort and mix of labor proposed; overall price is fair and reasonable (FAR 8.405-2(d)).
[ ] AbilityOne Program/National Industries for the Blind (NIB)/National Industries for the Severely Handicapped (NISH) (FAR 8.707).
[ ] Value/visual/cost analysis (DLAD 13.106-3(b)(S-90)(2)(iii) and (v)) (attach analysis).

Additional comments: ________________________________________________

_____ First Destination Packaging (FDP) program evaluation (see DLAD 13.106-2(b)(S-90)(3)(ii)(D)(2)):
    Compare the level of the packaging requirement (or state "Not applicable"):
    Current buy ______________________ Previous buy ______________________
    Unit of issue ____________ Unit price ____________ Quantity ______________
    Packaging changed from military to commercial: Yes or No
    Packaging negotiated (or not applicable):
        Yes: $ or % __________________________________________
        No: No change from commercial packaging or no change in packaging requirements since last buy.
    Savings realized: Yes or No
B. Reasons For Soliciting Only One Source – FAR 13.106-1(b)

[ ] Sole source (only known and approved source).
[ ] Other: _____________________________________________________________
[ ] Sole source (purchase order text (POT)).
[ ] Brand name, urgency, exclusive licensing agreement, industrial mobilization.
Additional comments for selection: _________________________________________
[ ] Item is a unique part number and no similar item is known.
[ ] Per vendor on (date): ______________________.
[ ] Per purchase request referral with ____________________, product specialist.
[ ] Per market research/experience using catalogs and other available data - attached.
[ ] Sole source price increase is not above 25%. If above 25%, complete Section C.

C. 25% Price Increase Limitations Certification For Sole Source Awards in accordance with DFARS 217.7505
The Contracting Officer hereby certifies to the Head of the Contracting Activity (HCA), in accordance with DFARS 217.7505, Limitations on Price Increases, that this sole source price has increased by 25% or more within the most recent 12-month period but, in accordance with the information provided below, it has been determined to be fair and reasonable. This section applies to the acquisition of replenishment parts and repairable or consumable parts.

Contractor’s name/contractor and Government entity (CAGE) code: ____________________________________________________________________
Total amount of the proposed award: __________________________________
Expected fair and reasonable price per unit: ___________
Negotiated price per unit: ____________ Quantity: ________
Most recent lowest price per unit within a 12-month period: ________
Price per unit difference: _______ Quantity: _______
Total price increase: ___________________________________
Percent of difference: ___________________________
Prior/current contractor: ___________________________________
Prior award date: ________________________

[ ] Check block if Head of the Contracting Activity (HCA) certification has been verified and retain a record in the official electronic contract file.
Summarize below the reasons for the price increase and price reasonableness:

_______________________________________________________________________
Acquisition specialist and date (signature is required for all memorandums)

_______________________________________________________________________
Contracting Officer and date (signature is required for all memorandums)

(The acquisition specialist shall scan and file the memorandum and supporting documentation in the current electronic contract writing system.)
PGI SUBPART 13.4 – FAST PAYMENT PROCEDURES
(Revised July 1, 2013 through PROCLTR 2013-59, dated June 26, 2013)
PGI 13.402 Conditions for use.
(a) – (e) [Reserved.]
(f)(1)(S-91)(A) (i) Use of fast pay requires a true and complete documentary reconciliation process by which receipt documentation (receipt acknowledgement or receiving report) is compared with contracting (purchase order or requisition) and accounting and payment data (invoice), after payment has been made by the Defense Finance and Accounting Service (DFAS). Actual receipts must be compared with order/contract requirements and actual payments to determine whether the vendor was paid properly.
(ii) While 100-percent matching of these transactions for verification purposes may not be feasible, lack of adequate reconciliation could constitute an Anti-Deficiency Act violation. The minimum acceptable verification, or post-disbursement matching, is review of a statistically valid sample of acquisitions valued up to $100,000, and of OCONUS subsistence transactions up to $200,000, for which fast pay has been used.
(iii) Every non-subsistence transaction in fast pay acquisitions with values greater than $100,000 and, for subsistence, greater than $200,000, must undergo this reconciliation process; sampling is not permitted for these acquisitions.
(iv) Except for tires, OCONUS subsistence, and OCONUS medical, fast pay shall not be used for transactions greater than $100,000. Fast pay shall not be used for transactions where the underlying contract does not contain FAR clause 52.213-1 and DLAD clause 52.213-9009 (both titled Fast Payment Procedure) and, for commercial item acquisitions, DLAD clause 52.212-9001.
(B) Acquisition specialists, with the assistance of DLA customer account specialists and order fulfillment resolution specialists, are to perform fast-pay validations. The contracting personnel are responsible for resolving delinquent contract lines; order fulfillment specialists deal with sales order closures. The debt-collection aspect of fast-pay reconciliation (for a contractor’s failure to repair, replace, or correct supplies that were lost, damaged, or not conforming to purchase requirements) is the responsibility of the contracting officer. This is an essential part of the overall contract closeout process, even for simplified acquisitions. Assistance may also be requested from financial customer liaisons and other financial operations (J8) personnel (who track down missing receipts), and from DFAS, as required.
(C) (i) Verification shall be performed by each supply chain on a quarterly basis beginning with the start of the fiscal year. Every fast-pay payment for transactions valued at $100,000 ($200,000 for OCONUS Subsistence) or less that has been made since the last verification period shall have an equal opportunity of being included in the random sample for reconciliation. For each quarter, all such fast-pay payments made over that quarter shall be included in the pool from which the sample is selected for verification.
(ii) To the extent that fast pay is featured in automated, as well as manual, transactions, both automated and manual transactions shall be included in the universe from which the random sample is selected. As stated in subparagraph (f)(1)(S-91)(A)(iii), above, all transactions valued at greater than $100,000 ($200,000 for OCONUS subsistence) shall be individually reconciled (that is, all BRAC tire acquisitions, any medical OCONUS DVDs, and OCONUS subsistence awards greater than $200,000); they shall not be included in the universe for random sampling purposes. The pool from which transactions will be selected for verification is not limited to those on which non-receipts and non-conformances have previously been brought to the contractor’s attention for resolution, or which are otherwise problematic (e.g., orders for which the customer never sent in a receipt acknowledgement).
(iii) Sampling shall not be performed on a “management by exception” basis. Consideration should be given to increasing the sample size or performing a 100% review for suppliers/customers with a problematic record.
(D)(i) DCAA’s “E-Z Quant” methodology constitutes a standardized sampling plan that satisfies the requirements for randomness, confidence, and statistical validity, and that provides for sampling on an automated or manual basis. Use of this method will produce an appropriate sample for submittal to the reconciliation process.
(ii) Once the sample has been obtained via E-Z Quant, contracting personnel shall perform receipt validation on each of the orders in the sample. The preferred method of receipt verification is evidence of a material receipt acknowledgement (MRA). The MRA is a systems driven transaction generated when the receiving activity’s accountable record is updated. For the purposes of fast pay validations, evidence of an MRA signifies delivery of the requisitioned material by the vendor. Supply chains may pull MRA data as part of a comprehensive data pull to support fast pay verifications or they may utilize the EBS report entitled “missing goods receipt for fast pay orders.” Those with missing or partial MRAs must be further researched to determine whether the customer received the items.
(iii) Past experience has shown that there are systemic issues which may prevent the MRA from posting in SAP, thus, contracting personnel must have the ability to validate customer receipt through alternative means. When an MRA is not available, contracting personnel may validate receipt via a receiving report in wide area workflow (WAWF), proof of delivery (POD) from the transporter, acknowledgement of receipt by the requisitioner or through some other means of verifying that the customer has received the material. The method of validation must be noted in the verification reports provided to J71.
(E)(i) Reconciliation results shall be reported by each supply chain to J7, attention: J71, within 60 days after posting of fast payment data for the verification period by DLA Office of Operations Research and Resource Analysis (DORRA) but not earlier than May 15 for Quarter 1 data, August 15 for Quarter 2 data, November 15 for Quarter 3 data, and February 15 for Quarter 4 data. Reports will follow a one quarter lag. For example, the verification report for the first quarter, October through December, will be due to J71 60 days after posting of fast payment data for the second quarter verification period but not earlier than May 15. The lag is intended to allow for sufficient time for customer receipts to post. J71 will review the results for negative trends and assess the need for potential corrective actions. J71 will also share results with J73 as supporting data to procurement management reviews.
(ii) If the reconciliation exposes discrepancies (including, but not limited to, unmatched disbursements, overages, underages, or uncorrected goods nonconformances) between the actual customer receipts and either vendor invoices or receipts generated by the system, or both, in more than 5 percent of the sample, an additional sample of the same size as the original group subjected to reconciliation, and excluding the transactions previously sampled, shall undergo the verification process. If more than 5 percent of this second sample is also discrepant, J7 may require verification of additional or all transactions.
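The escalation logic in this paragraph can be sketched as follows (illustrative names; "more than 5 percent" is treated as a strict comparison, per the text):

```python
def sample_discrepant(discrepancies, sample_size, threshold=0.05):
    """True when more than 5 percent of the sample is discrepant."""
    return discrepancies / sample_size > threshold

def reconciliation_next_step(first_discrepancies, first_size,
                             second_discrepancies=None, second_size=None):
    """Sketch of the two-stage verification escalation."""
    if not sample_discrepant(first_discrepancies, first_size):
        return "accept"
    if second_discrepancies is None:
        return "draw a second sample of equal size, excluding prior transactions"
    if not sample_discrepant(second_discrepancies, second_size):
        return "accept after second sample"
    return "J7 may require verification of additional or all transactions"
```

Note that a sample with exactly 5 percent discrepancies (e.g., 5 of 100) does not trigger the second sample, since the text requires "more than 5 percent."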
(F) In addition to the post-payment verification reports to be performed by each supply chain, proper application of fast payment procedures (i.e. all conditions met and clauses included) and documentation of the contract files as required by (f)(1) will be a part of all regular procurement management reviews.