Why are delivery error rates different between API and UI?

Summary

Delivery error rate discrepancies between APIs and UIs arise from a combination of factors. APIs typically expose real-time, raw data, whereas UIs often display aggregated, summarized, filtered, or cached data. These differences stem from variations in data processing pipelines, tracking methodologies, data aggregation methods, the handling of retries and bounces, timezone calculations, bot filtering, and the level of detail displayed. The error breakdowns shown in the Google Postmaster Tools (GPT) UI have also been noted as inaccurate. Key considerations include verifying data integrity and ruling out scaling errors when reproducing GPT graphs from API data.

Key findings

  • Real-Time vs. Aggregated Data: APIs provide real-time or near real-time data, while UIs display aggregated or summarized data.
  • Data Processing Pipelines: Differences in processing pipelines contribute to variations in error rates.
  • Data Filtering/Classifications: UIs may apply filters or classifications, altering the error rates.
  • Data Caching: UI dashboards often use cached data, causing discrepancies.
  • Retry Handling: APIs may count initial failures, while UIs show final delivery status.
  • Bounce Definitions: Different definitions of bounces affect error rate calculations.
  • Timezone Calculations: Timezone differences can impact daily metric reporting.
  • Bot Filtering: UIs may filter bot traffic, affecting error rates.
  • GPT Dashboard Inaccuracies: GPT breakdowns can be wildly inaccurate.
  • GPT Scaling Issues: Scaling discrepancies in code that reproduces GPT graphs from the API need to be ruled out.
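
The retry-handling difference listed above is easy to see in a small sketch: the same hypothetical event log yields two different error rates depending on whether every failed attempt is counted (API-style) or only each message's final status (UI-style). All names and statuses below are illustrative.

```python
# Hypothetical attempt-level event log: (message_id, attempt_no, status).
events = [
    ("m1", 1, "failed"), ("m1", 2, "delivered"),
    ("m2", 1, "delivered"),
    ("m3", 1, "failed"), ("m3", 2, "failed"),
]

# API-style view: every failed attempt counts -> 3 failures / 5 attempts.
api_rate = sum(1 for _, _, s in events if s == "failed") / len(events)

# UI-style view: only the final status per message counts -> 1 failure / 3 messages.
final = {}
for mid, _, status in sorted(events, key=lambda e: (e[0], e[1])):
    final[mid] = status  # later attempts overwrite earlier ones
ui_rate = sum(1 for s in final.values() if s == "failed") / len(final)
```

The same underlying traffic produces a 60% attempt-level rate but a 33% message-level rate, which is the scale of gap retries alone can open up.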

Key considerations

  • API Raw Data Verification: Check the raw API response for calculation errors.
  • Data Integrity Verification: Verify details of delivery error rates for alignment.
  • Data Source Appropriateness: Select the appropriate data source (API or UI) based on the use case.
  • Understanding Data Aggregation: Comprehend aggregation methods to reconcile discrepancies.
  • Code Correctness: If API data is used, ensure the code that pulls it is correct.
  • GPT Analysis: Approximate the GPT graphs from the API data (an exact replica is not expected) to validate your interpretation.
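
As a concrete starting point for the raw-response check above, a minimal sketch that sums per-class error ratios from a trafficStats-style payload and compares the total with the UI's headline rate. The sample payload is illustrative; the field names (`deliveryErrors`, `errorRatio`) follow the publicly documented Postmaster Tools API shape, but verify them against your own raw response.

```python
# Illustrative trafficStats payload; check field names against a real response.
sample = {
    "name": "domains/example.com/trafficStats/20240101",
    "deliveryErrors": [
        {"errorClass": "PERMANENT_ERROR", "errorType": "RATE_LIMIT_EXCEEDED",
         "errorRatio": 0.25},
        {"errorClass": "TEMPORARY_ERROR", "errorType": "LOW_REPUTATION",
         "errorRatio": 0.15},
    ],
}

def total_error_ratio(stats: dict) -> float:
    """Sum the per-class ratios; compare the total with the UI's headline rate."""
    return sum(e.get("errorRatio", 0.0) for e in stats.get("deliveryErrors", []))

overall = total_error_ratio(sample)
```

If the summed detail ratios and the overall rate disagree in the raw response itself, the discrepancy originates upstream of any code you wrote.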

What email marketers say
12 Marketer opinions

Discrepancies between API and UI delivery error rates stem from several factors. The UI often presents aggregated, summarized, or cached data, while the API provides more granular, real-time, and potentially unfiltered data. Variations in data processing pipelines, filtering of bot traffic, timezone calculations, the handling of retries and bounce classifications (hard vs. soft), and the use of different timeframes for calculations all contribute to these differences. It's also noted that GPT dashboard readings can be wildly inaccurate and that scaling must be ruled out when reproducing GPT graphs from API data.

Key opinions

  • Data Aggregation: The UI typically aggregates data, potentially hiding granular errors visible through the API.
  • Real-time vs. Cached Data: The API often provides real-time data, while the UI might use cached data, leading to inconsistencies.
  • Filtering and Calculations: The UI often applies filters (e.g., bot traffic) and calculations that are not present in the raw API data.
  • Retry Handling: The API may count initial delivery failures (even if successful on retry), while the UI reflects the final delivery status.
  • Bounce Classifications: Differences in defining and classifying bounces (hard vs. soft) between the API and UI can cause varying rates.
  • Timezone Calculations: Different timezone calculations (UTC vs. local) can impact daily metrics reported by the API and UI.
  • Dashboard inaccuracies: GPT dashboard readings for errors are often wildly inaccurate.
  • Scaling inaccuracies: API scaling discrepancies in the GPT graphs can be an issue.

Key considerations

  • Data Granularity: Understand the level of detail provided by both the API and the UI and how the data is aggregated in each.
  • Processing Pipelines: Be aware of the different processing pipelines used for API and UI data and how these might impact the reported metrics.
  • Data Definitions: Clarify how metrics like bounces and delivery errors are defined and calculated in both the API and the UI.
  • Data Latency: Consider the potential latency in UI data due to caching or batch processing.
  • Intended Use Case: Determine whether API data should be used and, if so, verify that the code pulling it is correct. Treat GPT breakdowns with skepticism.
Marketer view

Email marketer from StackExchange notes that API endpoints may have different levels of data granularity compared to the UI. The UI might aggregate data to simplify presentation, which could mask specific errors visible in the API.

July 2023 - StackExchange
Marketer view

Email marketer from Quora shares that UI delivery metrics may be calculated using a different timeframe than API data. The UI might show a daily summary, while the API could report hourly or even more granular data.

May 2024 - Quora
Marketer view

Email marketer from EmailGeeks Slack Community states that differences can stem from how the data is collected, particularly regarding retries. The API may count a 'failed' delivery which is later successful on a retry, where the UI presents the final delivery status.

October 2023 - EmailGeeks Slack Community
Marketer view

Email marketer from Email Analytics Forum replies that differences can come from different timezone calculations. The API might report data based on UTC, while the UI uses the user's local timezone, which can impact daily metrics.

July 2023 - Email Analytics Forum
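
The timezone effect described above can be sketched directly: the same two failure events fall on different days depending on the reporting timezone. The fixed UTC-8 offset below is chosen for illustration; real local zones also shift with DST.

```python
from datetime import datetime, timezone, timedelta

local_tz = timezone(timedelta(hours=-8))  # illustrative fixed offset

failures = [
    datetime(2024, 3, 2, 1, 30, tzinfo=timezone.utc),  # still Mar 1 in UTC-8
    datetime(2024, 3, 2, 12, 0, tzinfo=timezone.utc),
]

utc_days = [t.date().isoformat() for t in failures]
local_days = [t.astimezone(local_tz).date().isoformat() for t in failures]
# The first failure moves to the previous day in the local view, so the
# daily error counts in the two reports no longer match.
```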
Marketer view

Marketer from Email Geeks advises ignoring GPT breakdowns of delivery errors due to frequent inaccuracies. Suggests ruling out scaling discrepancies in the API code and approximating GPT graphs from the API data to identify interpretation issues.

January 2022 - Email Geeks
Marketer view

Marketer from Email Geeks suggests verifying the details of the delivery error rate to see if they align and highlights the oddity of a 40% overall delivery error rate with 0.0% detail points, noting Google may redact data.

September 2024 - Email Geeks
Marketer view

Marketer from Email Geeks suggests checking the API raw response for calculation errors and notes discrepancies have been observed before.

December 2022 - Email Geeks
Marketer view

Email marketer from Email Marketing Forum responds that UI dashboards often use cached data for faster loading times, while the API provides real-time, uncached data. This caching can cause discrepancies.

May 2023 - Email Marketing Forum
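
The caching effect can be sketched with a simple time-to-live (TTL) cache; all names and the TTL value are hypothetical.

```python
import time

live_error_count = 10  # what the "API" would return right now

def api_read() -> int:
    return live_error_count

_cache = {"value": None, "at": 0.0}
TTL = 300.0  # UI dashboards often cache results for minutes

def ui_read(now=None) -> int:
    """Serve the cached value until it is older than TTL, then refresh."""
    now = time.monotonic() if now is None else now
    if _cache["value"] is None or now - _cache["at"] > TTL:
        _cache["value"], _cache["at"] = api_read(), now
    return _cache["value"]

first = ui_read(now=0.0)    # caches 10
live_error_count = 12       # new errors arrive
stale = ui_read(now=100.0)  # still 10: inside the TTL window
fresh = ui_read(now=400.0)  # refreshed to 12
```

Until the cache expires, the "UI" keeps reporting the old count even though the "API" already sees the new errors.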
Marketer view

Marketer from Email Geeks shares that they frequently see different delivery error rate numbers in the line chart versus the detail when you click on a date in the line chart.

January 2024 - Email Geeks
Marketer view

Email marketer from Reddit suggests that the API might be pulling data directly from the source, while the UI applies certain filters or calculations before displaying the results, leading to different error rates.

March 2024 - Reddit
Marketer view

Email marketer from MarketingProfs Forum responds that some platforms filter out bot traffic or invalid email addresses in the UI, but not in the raw API data. This filtering can create discrepancies in delivery and error rates.

July 2023 - MarketingProfs Forum
Marketer view

Email marketer from Email Vendor Expert blogs that API and UI inconsistencies can be due to how different metrics are defined, especially bounces. 'Hard' vs. 'soft' bounces may be treated differently in calculations between the API and UI.

September 2021 - Email Vendor Expert
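
A minimal sketch of how bounce classification changes the headline rate, assuming a raw view that counts every bounce and a UI that counts only hard bounces (the statuses and split are illustrative):

```python
# Illustrative per-message outcomes.
outcomes = ["delivered", "hard_bounce", "soft_bounce", "soft_bounce", "delivered"]

sent = len(outcomes)
hard = outcomes.count("hard_bounce")
soft = outcomes.count("soft_bounce")

raw_rate = (hard + soft) / sent  # every bounce counts -> 3/5
ui_rate = hard / sent            # hard bounces only   -> 1/5
```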

What the experts say
2 Expert opinions

Experts agree that inconsistencies in delivery error rates between APIs and UIs arise from differing data processing and reporting methods. APIs often present raw, real-time data, while UIs use processed, summarized, or filtered data, leading to variations in reported error rates. These differences stem from disparities in tracking methodologies, processing times, and data aggregation approaches.

Key opinions

  • Different Data States: APIs provide raw, real-time data, whereas UIs show processed and summarized data.
  • Tracking Methodologies: Differing tracking methods between APIs and UIs contribute to inconsistent error rates.
  • Data Filtering: UIs may apply filters or classifications to the raw data, altering the error rates reported compared to the API.
  • Processing Time: APIs deliver near real-time data, while UI data can be delayed by processing.

Key considerations

  • Data Processing Awareness: Be aware of the data processing steps applied by the UI and how they differ from the raw data provided by the API.
  • Tracking Methodology Alignment: Understand the tracking methodologies used in both the API and the UI to reconcile differences in reported error rates.
  • Data Usage Context: Consider the intended use case for the data and choose the appropriate source (API or UI) based on the required level of detail and processing.
Expert view

Expert from Spam Resource explains that different definitions and tracking methodologies between the API and UI can lead to inconsistencies. The API might report raw data, while the UI applies filters or classifications that alter the error rates.

January 2022 - Spam Resource
Expert view

Expert from Word to the Wise, Laura Atkins, responds that API and UI discrepancies can arise from different processing times and data aggregation methods. The UI might reflect processed and summarized data, whereas the API may provide unprocessed, real-time data.

February 2025 - Word to the Wise

What the documentation says
5 Technical articles

Email service provider documentation consistently indicates that discrepancies in delivery error rates between APIs and UIs stem from differences in data processing, aggregation, and reporting. APIs typically offer real-time or near real-time data reflecting immediate activity, whereas UIs often display aggregated, summarized, or delayed data. Factors like processing variations, caching, data sampling, and different levels of detail also contribute to these discrepancies.

Key findings

  • Real-time vs. Aggregated Data: APIs provide real-time or near real-time data, while UIs present aggregated or summarized data.
  • Processing Pipelines: Different processing pipelines for API and UI data contribute to variances in reported rates.
  • Data Sampling/Estimation: UI reporting may use data sampling or estimation for performance reasons, while API data reflects full records.
  • Level of Detail: UI dashboards and APIs can have different levels of detail and aggregation, leading to discrepancies.
  • Data Delay: UI reports might display data with slight delays, which are not present in the API data.

Key considerations

  • Data Latency: Acknowledge potential data latency in UI displays due to aggregation and processing.
  • Data Source Selection: Select the appropriate data source (API or UI) based on the required level of detail, real-time vs. aggregated views, and potential for estimation/sampling.
  • Consistency Checks: Regularly compare API and UI data to ensure consistency and identify potential issues with data processing or reporting.
  • Understand Aggregation Methods: Be aware of the aggregation methods used for data presented in the UI and their potential impact on reported error rates.
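
The consistency-check suggestion above can be sketched as a small reconciliation routine. The function name, sample rates, and the 1% tolerance (to absorb aggregation and latency differences) are all illustrative.

```python
def reconcile(api_rates: dict, ui_rates: dict, tolerance: float = 0.01) -> dict:
    """Return days whose API and UI error rates differ by more than tolerance."""
    mismatches = {}
    for day in sorted(api_rates.keys() & ui_rates.keys()):
        delta = abs(api_rates[day] - ui_rates[day])
        if delta > tolerance:
            mismatches[day] = delta
    return mismatches

api = {"2024-05-01": 0.021, "2024-05-02": 0.400}
ui = {"2024-05-01": 0.020, "2024-05-02": 0.050}
bad_days = reconcile(api, ui)  # only 2024-05-02 exceeds the tolerance
```

Running such a check on a schedule surfaces processing or reporting problems before they distort a whole month of metrics.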
Technical article

Documentation from Postmark details that UI reporting could be subject to data sampling or estimation for performance reasons, whilst API data represents full records. Sampling can cause minor differences in delivery error rates.

February 2025 - Postmark
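
A deterministic sketch of how sampling can skew a reported rate (real vendor sampling strategies differ and are usually randomized):

```python
events = ["ok"] * 95 + ["error"] * 5  # true error rate: 5%

full_rate = events.count("error") / len(events)

sample = events[::10]  # a crude 10% deterministic sample
sampled_rate = sample.count("error") / len(sample)
# All five errors fall outside this sample, so the sampled rate is 0%.
```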
Technical article

Documentation from Mailgun explains that differences can arise due to the API providing real-time data while the UI might display aggregated or slightly delayed data. Processing variations and caching mechanisms can also contribute.

December 2021 - Mailgun
Technical article

Documentation from Amazon SES explains that API results provide near real-time data reflecting immediate activity, whereas the console (UI) might reflect batch processed data with delays. Differences in data aggregation methods can also contribute to varying rates.

May 2022 - Amazon SES
Technical article

Documentation from SendGrid answers that discrepancies can occur because of different processing pipelines for API and UI data. The UI often presents summarized data, whereas the API provides more granular, real-time information which can lead to variances when aggregated manually.

September 2022 - SendGrid
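
One common way the manual re-aggregation mentioned above goes wrong is averaging per-bucket rates instead of dividing the totals; the bucket numbers below are illustrative.

```python
buckets = [(1, 3), (0, 97)]  # (errors, sent) per hour

# Averaging per-bucket rates overweights tiny buckets -> ~16.7%.
avg_of_rates = sum(e / s for e, s in buckets) / len(buckets)

# Dividing the totals gives the true overall rate -> 1%.
rate_of_totals = sum(e for e, _ in buckets) / sum(s for _, s in buckets)
```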
Technical article

Documentation from SparkPost indicates that UI dashboards and APIs might have different levels of detail and aggregation. For example, the UI might not display temporary errors that are visible via the API, leading to discrepancies in reported rates.

May 2022 - SparkPost