The impact of vehicular noise on acoustic indices within simulated bird assemblage soundscapes

ABSTRACT Passive acoustic monitoring (PAM) is a sampling technique that has gained increasing popularity in the field of wildlife monitoring and research since it allows for non-invasive and cost-effective collection of acoustic information. Retrieving biological information from PAM recordings can often involve time-consuming sound annotation methodologies, but the advent of acoustic indices can help expedite this process. While correlations between acoustic indices and species richness have been observed in a variety of ecological contexts, these relationships falter in environments with increased vehicular noise. Here, we assessed the direct impact of vehicular noise on nine acoustic indices through controlled manipulation of vehicular noise within computer-generated bird assemblage soundscapes. Our results demonstrate that recording distance from roadsides and the number of passing cars per minute have notable and persistent impacts on acoustic index values, but the magnitude of the effect varies across indices. Four acoustic indices demonstrated greater resilience to vehicular noise interference and may therefore be better suited for developed areas: the Bioacoustic Index, Acoustic Complexity Index, Acoustic Diversity Index, and Acoustic Evenness Index. By contributing to the collective understanding of acoustic index behaviours under anthropogenic noise pollution, we hope to better inform their ecological application in human-developed contexts.


Introduction
As human populations across the world become increasingly urbanised, it is important to understand the role that developed environments play in the maintenance of biodiversity (Elmqvist et al. 2013). Anthropogenic noise (i.e. anthrophony), including omnipresent sounds found in cities (e.g. vehicular traffic, construction), can be a considerable environmental stressor for many wildlife species within and around developed environments (Kleist et al. 2018; P. V. A. Ribeiro et al. 2022). Although the impact of noise pollution has long been a subject of interest for wildlife ecologists (Sordello et al. 2020), conventional sampling techniques (e.g. point counts, line transects) may limit the temporal and spatial scale at which impacts can be observed unless substantial human effort is utilised (Sugai et al. 2019). Additionally, for taxa that primarily rely on auditory cues for communication, such as birds, the presence of anthropogenic noise events may increase avoidance behaviours, potentially causing ecological surveys to underreport species richness estimates (Carral-Murrieta et al. 2020). The technique of passive acoustic monitoring (PAM) has since become increasingly more accessible and favourable for addressing ecological questions revolving around anthropogenic noise pollution (J. W. Ribeiro et al. 2017). PAM involves the deployment of autonomous recording units (ARUs) that are programmed to record audio in the field for defined periods of time. With the advent of PAM, new avenues of research regarding anthropogenic noise effects are possible, including long-term monitoring of noise pollution exposure (Erbe 2013), mapping the extent and intensity of noise disturbances (Desjonquères et al. 2020), and analysing correlations between anthrophony and changes in biotic noises within the same recordings (Alvarez-Berríos et al. 2016).
Acoustic indices, inspired by traditional species diversity indices, are numerical summaries meant to characterise biological information based on the sounds produced in an area (Sueur et al. 2014). Acoustic indices have gained increasing popularity due to their ability to quickly summarise biological information from the large quantities of recordings created by PAM, which has historically been a bottleneck in the processing of acoustic data (Alcocer et al. 2022). To date, several acoustic indices have been developed, each with a unique biological principle, usage, and interpretation. The most common application of acoustic indices, despite mixed reliability, is to estimate biodiversity metrics such as species richness, a process that may otherwise involve time-consuming sound annotation methodologies (Alcocer et al. 2022).
While correlations between acoustic indices and species richness have been observed in a variety of ecological contexts (Eldridge et al. 2018; Bradfer-Lawrence et al. 2020; Dröge et al. 2021), these correlations often falter in environments with anthropogenic noise (Fairbrass et al. 2017). Specifically, anthropogenic noise can cause sound distortions and mask biological signals of interest, potentially leading to biased acoustic index estimates (Gasc et al. 2015; Buxton et al. 2018). Though developed areas emit many sources of anthropogenic noise, vehicular noise is recognised as a prominent source of anthropogenic sound disturbance (Singh and Davar 2004; Buxton et al. 2019). Past studies illustrate that vehicular noise can bias acoustic index values (Fuller et al. 2015; Fairbrass et al. 2017); however, it is unclear whether the magnitude of these biases is constant across different traffic conditions, and the extent to which vehicular noise skews our estimates of biodiversity (Quinn et al. 2022). Additionally, few solutions exist for obtaining unbiased acoustic indices in the presence of vehicular noise, leading researchers who intend to use acoustic indices to avoid areas with high vehicular noise levels when deploying ARUs (Alcocer et al. 2022). While high-pass sound filters are commonly implemented for this purpose (Towsey et al. 2014; Khanaposhtani et al. 2019; Bradfer-Lawrence et al. 2020), their effects on acoustic indices vary depending on environmental conditions (Hyland et al. 2023), and they theoretically cannot mask noises present at higher frequencies, such as vehicular noise at short distances. Thus, an analysis of acoustic index behaviours at varying degrees of vehicular noise will lead to greater resolution regarding when and where acoustic indices will accurately estimate diversity metrics.
Here, we assess the impact of vehicular noise on acoustic indices through controlled manipulation of vehicular noise within simulated (i.e. computer-generated) bird assemblage soundscapes (hereafter referred to as 'soundscapes'). Simulated soundscapes present a unique opportunity to analyse acoustic indices in a controlled environment through the combination of sound clips and computer-generated noise within audio editing software. When sound clips and computer-generated noise are combined with biological and ecological reasoning in mind, these soundscapes can simulate acoustic characteristics of interest present within natural environments (Gasc et al. 2015). By explicitly controlling the vocalising bird community and the frequency and magnitude of vehicular noise, we hope to better qualify the direct impacts of vehicular noise on the effectiveness of acoustic indices. Additionally, we aim to illustrate the effect vehicular noise may have on species richness estimates derived from acoustic indices. This research will ideally foster more accurate information from future monitoring efforts involving acoustic indices in developed contexts aimed at informing wildlife conservation and management.

Methods
Simulating bird assemblage soundscapes allows for the analysis of acoustic indices in an environment in which the number of birds vocalising per minute, the volume of bird vocalisations, and extraneous sources of acoustic variation (e.g. wind or anthropogenic noise) can be controlled. Our process involved the creation of 46 randomised bird assemblages, each of which was used to create 12 'treatment' soundscapes with different magnitudes of vehicular noise interference, as well as a 'control' soundscape with no added vehicular noise (Figure 1).

Randomized bird assemblages
We created a suite of randomised bird assemblages to simulate real vocalising bird communities. A vocalising species is defined as a bird species that regularly vocalises and could be identified from these vocalisations alone. Specific species in each bird assemblage were chosen by randomly sampling the eBird database for user-submitted checklists containing at least four species from the eastern United States during May through July 2022 (Sullivan et al. 2009). This region was chosen because most vocalising bird species within it have breeding seasons within the selected time frame, and it thus represents a temporal and spatial range in which many bird species would be expected to be vocalising simultaneously. We then randomly selected four vocalising species from each sampled checklist, as this was the closest integer to the mean number (3.98) of vocalising bird species detected in a one-minute interval within recent sound recordings in the region (Pease et al., unpublished data). In total, 77 unique bird species were represented within the assemblages (Table A1). Although a single bird species may be present in several assemblages, there were no identical assemblages. Using vocalising bird species from eBird checklists helps ensure that the randomised assemblages reflect real assemblages of bird species that would be expected to vocalise at the same time.
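As a rough illustration of this sampling step, the sketch below draws four species at random from a checklist. The checklist contents, the function name `draw_assemblage`, and the four-species default are illustrative stand-ins for the eBird-based procedure, not the study's actual code.

```python
import random

# Hypothetical stand-ins for eBird checklists (lists of vocalising
# species); the study sampled user-submitted checklists with at least
# four species from the eastern United States, May-July 2022.
checklists = [
    ["Northern Cardinal", "Carolina Wren", "Tufted Titmouse",
     "Blue Jay", "American Robin"],
    ["Wood Thrush", "Ovenbird", "Red-eyed Vireo",
     "Scarlet Tanager", "Eastern Towhee"],
]

def draw_assemblage(checklist, size=4, rng=random):
    """Randomly select `size` vocalising species from one checklist."""
    if len(checklist) < size:
        raise ValueError("checklist must contain at least four species")
    return sorted(rng.sample(checklist, size))

assemblage = draw_assemblage(checklists[0])
```

Returning the draw in sorted order makes assemblages directly comparable, so duplicate draws can be detected and rejected, mirroring the requirement that no two assemblages be identical.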

Control soundscape creation
Sound clips of selected bird species were sourced from The Cornell Guide to Bird Sounds: Master Set for North America (Version 2021) (2022). Representative clips for each species were chosen based on the ability to isolate the vocalisation of interest from other bird vocalisations or background noise, as some sound clips contain several vocalising species. Vocalisations from a single individual were prioritised over duets or flock recordings. For each clip, background noise and extraneous bird vocalisations were removed using high-pass and low-pass filters in Audacity (Version 3.2.3) (2022), which increasingly attenuate sounds as their frequency moves further from the designated cut-off. Each clip shorter than one minute was extended to a one-minute duration through adjacent repetition of the clip. For each assemblage, a one-minute soundscape was created by overlapping the four selected species' sound clips into a single wave file. We acknowledge that this methodology may not accurately replicate the complex temporal organisation of song bouts between co-occurring species. Each soundscape was then adjusted to an amplitude level of 20 dB using the 'Amplify' effect in Audacity.
We then added background noise to simulate real recordings retrieved from ARUs during PAM efforts. Using the 'Generate noise' effect within Audacity, pink noise with an amplitude of 0.03 was added to each soundscape to better replicate field recordings of bird communities using AudioMoths (Hill et al. 2019) and similar ARUs (Figure 2). In total, 34 control soundscapes contained an assemblage of 4 randomly selected bird species. We chose 34 replicates based on an a priori power analysis using G*Power (Version 3.1.9.7) (Faul et al. 2007). In addition, 12 control soundscapes containing assemblages of 3, 8, and 12 randomly selected species (4 soundscapes each) were created to assess how the number of species influences acoustic index values.

Figure 1. Workflow of the soundscape creation process for each bird assemblage (n = 46). For each randomized bird assemblage, bird vocalizations and computer-generated pink noise were used to create one control soundscape. Pink noise was added to simulate background noise present in real recordings retrieved from ARUs. These same audio elements were then used to create 12 treatment soundscapes, each with a different level of vehicular noise addition to simulate ARU recordings at combinations of three deployment distances from roadsides (50 m, 150 m, and 250 m) and four traffic levels (1, 3, 5, and 7 passing cars per minute).
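The study generated pink noise with Audacity's 'Generate noise' effect; the sketch below is an assumed numpy equivalent that shapes white noise to a 1/f power spectrum and mixes it with (here silent placeholder) species tracks. The 48 kHz sample rate and the 0.03 scaling are illustrative assumptions, not values taken from our workflow.

```python
import numpy as np

def pink_noise(n_samples, rng=None):
    """Approximate pink (1/f) noise: shape the spectrum of white noise
    so power falls off as 1/f, then normalise to a peak of 1."""
    rng = rng or np.random.default_rng()
    spectrum = np.fft.rfft(rng.standard_normal(n_samples))
    freqs = np.fft.rfftfreq(n_samples)
    freqs[0] = freqs[1]            # avoid dividing by zero at DC
    spectrum /= np.sqrt(freqs)     # 1/f power -> 1/sqrt(f) amplitude
    pink = np.fft.irfft(spectrum, n_samples)
    return pink / np.max(np.abs(pink))

SR = 48_000                        # assumed ARU sample rate
n = SR * 60                        # one-minute soundscape
species_tracks = np.zeros((4, n))  # placeholders for four species clips
soundscape = species_tracks.sum(axis=0) + 0.03 * pink_noise(n)
```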

Vehicular noise addition
Each of the 46 control soundscapes was used to develop additional one-minute soundscapes that replicate the vehicular noise interference experienced within an ARU recording at different distances from a roadside and at different traffic levels. Specifically, soundscapes were created to replicate vehicular noise interference at 50 m, 150 m, and 250 m from a roadside and at traffic levels of 1, 3, 5, and 7 cars per minute (Figure 1). These traffic levels correspond to annual average daily traffic (AADT) levels of 1440, 4320, 7200, and 10,080, respectively (e.g. 1440/(1440 minutes in one day) = 1 car per minute). In turn, these traffic levels reflect a range of AADT levels for roadways across the continental United States, excluding major interstate highways. Additionally, these distances were chosen because bird presence and regular breeding were not significantly affected by an AADT of 10,080 or less past 400 m (Forman et al. 2002), and the amplitude and duration of vehicular noise were notably similar within ARU recordings between 250 and 400 m from the roadside.
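The cars-per-minute to AADT conversion quoted above is simple arithmetic, restated here for clarity:

```python
# AADT = cars per minute x minutes per day (24 x 60 = 1440), matching
# the example in the text: 1440 / 1440 minutes in one day = 1 car/min.
MINUTES_PER_DAY = 24 * 60

def cars_per_minute_to_aadt(cars_per_minute):
    return cars_per_minute * MINUTES_PER_DAY

aadt_levels = [cars_per_minute_to_aadt(c) for c in (1, 3, 5, 7)]
# -> [1440, 4320, 7200, 10080]
```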
Vehicular noise was added to each soundscape using a 38-second sound effect clip from the YouTube Audio Library ("Car Drive By" 2023). The spectrogram of this sound clip presented a similar acoustic pattern and frequency level to previous ARU recordings taken of multiple vehicles of different makes and models, and thus was deemed appropriate for use. Although various vehicles exist in typical traffic, we chose to use this single clip because it reflects typical passenger vehicles and the use of a single clip may better isolate any effect of vehicles on acoustic indices. Traffic levels of 1, 3, 5, or 7 determined how many times the sound clip was added to each soundscape (e.g. a soundscape with a traffic level of 3 would have three vehicle sound clips added). The start of each vehicle noise clip added to each soundscape was determined using a random number generator from 0 to 59, with the result corresponding to the second at which the midpoint of the vehicular noise clip played, and with sound clips overlapping when appropriate. The distance-from-roadside levels determined the amplitude and duration of the added vehicle sound clips, with closer distances having a greater amplitude and shorter duration. To develop accurate durations and amplitudes at each distance, vehicles were recorded with an ARU within a fallow field at each proposed distance from a roadside on a windless day with no other acoustically detectable roads nearby. The amplitude and duration of the vehicle sound clips were edited using the 'Amplify' and 'Change Tempo' effects in Audacity, respectively, to reflect the duration and amplitude peaks of the empirical recordings, as illustrated using the 'Plot Spectrum' analysis tool.
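The random placement of vehicle clips described above can be sketched as an overlap-add operation. This is an illustrative reconstruction with a toy clip and an assumed 48 kHz sample rate, not the Audacity workflow actually used.

```python
import numpy as np

SR = 48_000   # assumed sample rate; the study worked in Audacity

def add_car_passes(soundscape, car_clip, n_cars, rng=None):
    """Overlap-add `n_cars` copies of a vehicle clip into a one-minute
    soundscape, centring each copy on a uniformly random second (0-59)
    and truncating samples that fall outside the window."""
    rng = rng or np.random.default_rng()
    out = soundscape.copy()
    half = len(car_clip) // 2
    for _ in range(n_cars):
        midpoint = int(rng.integers(0, 60)) * SR  # random second 0-59
        start = midpoint - half
        lo, hi = max(start, 0), min(start + len(car_clip), len(out))
        out[lo:hi] += car_clip[lo - start:hi - start]
    return out

base = np.zeros(SR * 60)
toy_clip = np.full(SR * 2, 0.1)   # toy 2-second "car" clip
mixed = add_car_passes(base, toy_clip, n_cars=3,
                       rng=np.random.default_rng(0))
```

Because each copy is centred on its random second, clips near the window edges are truncated and multiple passes simply sum where they overlap, as in the study's mixing step.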
In total, 598 soundscapes were created: 46 control soundscapes with no vehicle noise and 552 treatment soundscapes representing each combination of traffic levels and distance from roadside (3 distances × 4 traffic levels × 46 soundscapes).

Acoustic indices
We analysed the impact of vehicular noise on nine acoustic indices. These indices were chosen based on their widespread usage in the published literature for estimating avian species richness (Table 1; Alcocer et al. 2022). Acoustic indices were calculated using the seewave (Sueur et al. 2008) and soundecology (Villanueva-Rivera and Pijanowski 2018) packages in program R (Version 4.2.2). In soundecology, the default settings of the functions 'bioacoustic_index', 'acoustic_complexity', 'acoustic_diversity', and 'acoustic_evenness' were used to calculate BI, ACI, ADI, and AEI, respectively. In seewave, the default settings of the 'H', 'M', and 'AR' functions were used, and the default settings of the 'NDSI' function were used except that the max argument was set to TRUE to better reflect the original formulation used in Kasten et al. (2012). The Number of Frequency Peaks (NP) was calculated according to the methodologies of Gasc et al. (2013) using the functions 'meanspec' and 'fpeaks' in seewave (Sueur et al. 2008; Sueur 2018). Notably, all examined acoustic indices were developed with the intent either to monitor avian communities or to summarise overall animal acoustic diversity. Therefore, default settings were primarily used in order to maintain wider applicability and comparability of our results with other studies, though we recognise that this may in turn limit the applicability of the obtained acoustic index behaviours to other taxonomic groups or non-terrestrial environments (Alcocer et al. 2022; Bradfer-Lawrence et al. 2023).
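Our indices were computed with the R packages above; for intuition, the following Python sketch implements the core ACI calculation from Pieretti et al. (2011) on a magnitude spectrogram. It is a simplified re-implementation for illustration, not the soundecology code.

```python
import numpy as np

def acoustic_complexity_index(spectrogram):
    """Simplified ACI: for each frequency bin, sum the absolute
    amplitude differences between adjacent time frames, normalise by
    the bin's total amplitude, then sum across bins.
    `spectrogram` is a (freq_bins, time_frames) magnitude array."""
    spectrogram = np.asarray(spectrogram, dtype=float)
    diffs = np.abs(np.diff(spectrogram, axis=1)).sum(axis=1)
    totals = spectrogram.sum(axis=1)
    totals[totals == 0] = np.finfo(float).eps  # avoid division by zero
    return float((diffs / totals).sum())

# A rapidly fluctuating narrowband signal (bird-song-like) scores
# higher than a steady signal (drone-like) with the same mean level.
t = np.arange(100)
fluctuating = np.abs(np.sin(t))[None, :]
steady = np.full((1, 100), float(np.abs(np.sin(t)).mean()))
aci_fluct = acoustic_complexity_index(fluctuating)
aci_steady = acoustic_complexity_index(steady)
```

This contrast in how ACI treats fluctuating versus steady signals is the mathematical basis for its relative resistance to broadband vehicle noise discussed later.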

Analysis
The nine acoustic indices were calculated for the 598 bird assemblage soundscapes. To ensure that observed changes in acoustic index values caused by vehicular noise addition were not influenced by the number of species in each assemblage, we first created linear mixed-effects models using 52 soundscapes with 4 species (i.e. 13 soundscapes for each of the first 4 bird assemblages) and the 156 soundscapes that had 2, 8, or 12 species, resulting in a total of 208 soundscapes for this preliminary test. A linear mixed-effects model was created for each of the nine acoustic indices (response variable) using the 'lmer' function of the lme4 package (Bates et al. 2015) in program R (Version 4.2.2), which predicted the respective acoustic index using the fixed effect variables of cars per minute, simulated distance from roadside, number of species, and the interactive effect between cars per minute and simulated distance, with the random effect variable of assemblage identity. All indicator variables were treated as categorical factors. The 16 control soundscapes containing no vehicular noise were designated as reference treatments for each model. For the reference treatments, the number of cars was designated as 0 and the distance from roadsides was arbitrarily labelled as 500 m due to previous evidence for the lack of vehicular noise influence on bird populations past 400 m from roadsides (Forman et al. 2002). Next, following verification that vehicular noise impacts were consistent across species richness levels, we tested the nine acoustic indices across all 442 replicate soundscapes created with four species using linear mixed-effects models. These models predicted the respective acoustic index using the same variables as the first nine mixed-effects models, excluding the number of species as this was constant for all soundscapes. As with the previous models, control soundscapes (n = 34) were treated as the reference treatments of each model and indicator variables were treated as categorical factors. Estimated regression coefficients were then used to determine the relative impact of their respective categorical variable levels on acoustic index values.
To illustrate how acoustic index biases caused by vehicular noise interference may affect biodiversity estimates derived from acoustic indices, we predicted acoustic index values at different vehicular noise levels and calculated the number of added 'species equivalents', representing the number of vocalising bird species necessary to bias an acoustic index value to the same degree as the vehicular noise addition. Relationships between species richness and acoustic indices are often inconsistent in magnitude, making it difficult and perhaps inappropriate to correlate acoustic indices with specific species richness values (Bradfer-Lawrence et al. 2023). In turn, species equivalent values should not be equated to species richness, but rather serve to illustrate the magnitude of acoustic index bias caused by added vehicular noise as opposed to added bird vocalisations. The Bioacoustic Index (BI) was excluded from this analysis due to its non-linear correlation with the number of species, which resulted in inconsistent species equivalent values (Figure 3; see Appendix 1 for more details). For each of the remaining eight acoustic indices, linear models were created using the 12 control soundscapes containing 2, 8, or 12 vocalising bird species and 4 control soundscapes containing 4 species. These linear models predicted the number of species based on the respective acoustic index value as the fixed effect variable. The linear equations derived from these models were subsequently used to convert acoustic index values into species equivalents within the soundscape. The linear mixed-effects models, derived from the 442 soundscapes containing four species, were used in conjunction with the 'predict' function of the stats package in program R (Version 4.2.2; R Core Team 2022) to estimate acoustic index values at high (7 cars per minute at 50 m from roadside), medium (5 cars per minute at 150 m from roadside), and low (3 cars per minute at 250 m from roadside) vehicular noise interference levels. These estimated acoustic index values were then converted into species equivalent estimates using the established linear equations. Four species were subtracted from each species equivalent estimate to account for the four vocalising bird species present within each soundscape.
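The species-equivalent conversion amounts to inverting a control-soundscape regression and subtracting the four species actually present. The sketch below uses made-up index values to show the mechanics; our real fits were per index, in R, and the numbers here are illustrative only.

```python
import numpy as np

# Illustrative (made-up) control-soundscape data: acoustic index
# values at known species richness levels.
richness = np.array([2.0, 4.0, 8.0, 12.0])
index_values = np.array([0.80, 0.76, 0.68, 0.60])

# Fit species ~ index, inverting the index-richness relationship.
slope, intercept = np.polyfit(index_values, richness, 1)

def added_species_equivalents(predicted_index, true_richness=4):
    """Convert a model-predicted index value under vehicular noise
    into added species equivalents: apply the control linear fit,
    then subtract the species actually present in the soundscape."""
    return slope * predicted_index + intercept - true_richness

# A noise-biased index prediction of 0.70 implies this many spurious
# species equivalents under the toy fit above:
bias = added_species_equivalents(0.70)  # -> 3.0
```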

Species effect on acoustic indices
Within control soundscapes, species richness levels displayed a rough linear correlation with each acoustic index, except for BI (Figure 3). For all nine indices, the number of species present in the soundscapes influenced acoustic index values consistently across all levels of vehicular noise and distance (Figure 4). That is, changes in acoustic index values caused by the addition of vehicular noise remained consistent regardless of the number of species present. In turn, it was deemed appropriate to use only soundscapes with the same number of species (four) in further analysis of the effects of vehicular noise on acoustic index values.

Vehicular noise effects
The fixed effect of the number of cars was highly significant (p < 0.01) for all nine acoustic index linear mixed-effects models derived from the 442 soundscapes containing four bird species each, indicating that each acoustic index differed across the traffic levels (Table A2). There were no significant (p > 0.05) fixed effects of distance from roadside within any of these acoustic index models, indicating that each acoustic index did not differ statistically at each distance from road, holding traffic constant at the control level (Table A2). Interactive effects differed for each of these acoustic index models (Table 2).
Generally, the introduction of vehicular noise within treatment soundscapes resulted in greater differences in acoustic index values between treatment and control soundscapes. For eight indices (excluding NP), the magnitude of deviation from the control soundscape index values increased as the simulated distance from roadsides decreased and the number of passing cars per minute increased (Figure 5).

Added species equivalents
All eight examined acoustic indices produced similarly biased species equivalent values at low vehicular noise levels, indicating that even small amounts of vehicular noise interference can bias acoustic index values compared to soundscapes containing no added vehicular noise (Figure 6). The indices H, M, AR, and NDSI all displayed notable divergence in added species equivalents at medium and high vehicular noise levels compared to low noise levels, while ACI displayed divergence only at high noise levels (Figure 6). The indices ADI, AEI, and NP displayed consistent added species equivalents regardless of vehicular noise level (Figure 6).

Figure 4. Responses and standard errors were obtained using the 'effect' function of the 'effects' package in R (Fox and Hong 2009), in which the number of species and the interaction term were used as predictors. Colours indicate different species richness levels tested. Acoustic indices include Bioacoustic Index (BI), Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).

Discussion
Although PAM offers a promising path forward for rapid biodiversity assessments, it may be limited to areas with minimal anthropogenic noise if the impacts of anthrophony are not understood and accounted for (Fairbrass et al. 2017; Alcocer et al. 2022). Using simulated bird assemblage soundscapes featuring bird vocalisations and traffic noise, we offer a quantitative perspective on how varying degrees of vehicular noise introduction directly affect nine acoustic indices commonly used to summarise biological information present in PAM recordings. Our results provide support for several theoretically predicted and empirically observed acoustic index behaviours in the presence of anthropogenic noise (Joo et al. 2011; Kasten et al. 2012; Wa Maina et al. 2016; Fairbrass et al. 2017). While the analysed acoustic indices are designed to be a function of the level of biotic noise within an environment, we observed that acoustic index deviations caused by vehicular noise addition are consistent in magnitude regardless of the number of vocalising bird species present in the soundscapes. This absence of interaction between the effects of vehicular noise and biotic noise on acoustic index values supports the applicability of the retrieved regression coefficients across different levels of avian species richness. In turn, our results provide a quantitative, controlled perspective on how vehicular noise can influence acoustic index values derived from PAM efforts.
Adding vehicular sound in a controlled way that reflects the acoustic qualities of recorded soundscapes with varying anthrophony settings allowed us to offer guidance to researchers who intend to use acoustic indices in PAM efforts. Although researchers using acoustic indices may avoid areas with high vehicular noise levels (Machado et al. 2017; Mammides et al. 2017), we found that BI, ACI, ADI, and AEI can all be implemented at a distance of 150 m at low traffic levels and 250 m at high traffic levels without significant influence from vehicular noise on their outputs. As such, we recommend that ARUs be deployed at these distances or greater for PAM initiatives intending to use these four acoustic indices. Importantly, the indices ADI and AEI also produced small and consistent changes in species equivalent calculations at all evaluated traffic levels, lending support for their usage as biodiversity proxies even in environments where vehicular noise pollution is present. These indices are derived from mathematical principles that allow for greater resistance to vehicular noise. The default minimum and maximum frequency limits of BI (2000 Hz and 8000 Hz) filter out most of the noise added by the vehicle sound clip, leaving the bird vocalisations as the primary sound input for BI calculations. Similarly, ACI was developed to detect the rapid amplitude variations within individual frequency bins, a feature typical of bird songs and not characteristic of most vehicular noises (Pieretti et al. 2011). Additionally, passing traffic produces sounds that span multiple frequency bins used in the calculations of ADI and AEI. Because of this, the effect of added vehicle noise on index outputs may be dampened in comparison to, for example, added bird vocalisations, which tend to occupy a narrow frequency band and therefore might have a greater impact on evenness calculations.

Figure 5. All soundscapes contained a random sample of four vocalizing bird species. Each subplot corresponds to a specific acoustic index, and the facets within correspond to different distances from roads while the x-axes show variation in traffic levels. Responses and standard errors were obtained using the 'effect' function of the 'effects' package in R (Fox and Hong 2009), in which the interaction term was used as the predictor. Acoustic indices include Bioacoustic Index (BI), Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).
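The band-occupancy argument for ADI and AEI can be illustrated with a minimal Shannon-index sketch. This is a simplified stand-in for the soundecology implementation, with made-up occupancy fractions: adding broadband energy to every 1 kHz band shifts the band proportions toward evenness rather than concentrating them in one band.

```python
import numpy as np

def acoustic_diversity_index(band_occupancy):
    """Simplified ADI: Shannon index over the proportion of active
    (above-threshold) spectrogram cells in each 1 kHz frequency band."""
    p = np.asarray(band_occupancy, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Made-up occupancy fractions for six bands: bird song concentrated in
# two bands, then broadband vehicle noise adding 0.1 to every band.
birds_only = [0.40, 0.40, 0.05, 0.05, 0.05, 0.05]
birds_plus_car = [b + 0.10 for b in birds_only]
adi_birds = acoustic_diversity_index(birds_only)
adi_car = acoustic_diversity_index(birds_plus_car)
```

In this toy case the uniform broadband addition nudges the proportions toward evenness; the direction and size of the shift depend on the starting distribution, which is consistent with the distance-dependent ADI and AEI behaviour discussed below.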
Only one of the examined acoustic indices, NDSI, was expressly developed to detect changes in anthropogenic noise levels, a behaviour exemplified by our findings. In turn, NDSI values may be less sensitive to biodiversity changes in environments with a higher degree of vehicular noise pollution (see Bradfer-Lawrence et al. 2023). As such, we discourage the use of NDSI in this context. Despite not being specifically developed to detect anthropogenic noise, the indices M, AR, and H behaved similarly to NDSI with the addition of vehicular noise, even at considerable distances from roadsides. Because of this, we similarly advise against calculating these indices from recordings possessing prominent vehicular noise interference.
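NDSI's sensitivity to low-frequency traffic energy follows directly from its definition. The sketch below is a simplified re-implementation (the study used seewave's 'NDSI'), using roughly 1-2 kHz for anthrophony and 2-8 kHz for biophony; the exact band bounds are configurable in the original formulation, and the spectra here are toy values.

```python
import numpy as np

def ndsi(power, freqs, anthro=(1000, 2000), bio=(2000, 8000)):
    """Simplified NDSI: (biophony - anthrophony) / (biophony +
    anthrophony), with anthrophony and biophony as summed spectral
    power in the given frequency bands."""
    freqs = np.asarray(freqs, dtype=float)
    power = np.asarray(power, dtype=float)
    alpha = power[(freqs >= anthro[0]) & (freqs < anthro[1])].sum()
    beta = power[(freqs >= bio[0]) & (freqs < bio[1])].sum()
    return float((beta - alpha) / (beta + alpha))

freqs = np.arange(0, 10_000, 100)
power = np.zeros(freqs.shape, dtype=float)
power[(freqs >= 3000) & (freqs < 5000)] = 1.0   # bird song band
clean = ndsi(power, freqs)                      # no traffic -> 1.0
power[(freqs >= 1000) & (freqs < 2000)] += 2.0  # low-frequency traffic
noisy = ndsi(power, freqs)                      # pulled toward -1
```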
Unlike the other examined acoustic indices, the NP index performed inconsistently when exposed to anthropogenic noise and may not be suitable for PAM in developed areas. Specifically, NP does not display a clear directional change in magnitude as vehicular noise is added. We speculate this may be the result of NP being the only analysed index in which the output is a whole integer, ranging from 0 to 9 in the case of this study, which is a notably narrower range of possible outputs compared to the other analysed indices. This suggests the need for further research to gain a more comprehensive understanding of the biological interpretations associated with each NP integer value, which may necessitate a larger sample of audio inputs with various soundscape characteristics. Furthermore, while marine anthrophony has also resulted in inconsistent NP outputs (Ferguson et al. 2023), no study to date has thoroughly examined the behaviour of NP under varying terrestrial anthropogenic noise circumstances, indicating a significant gap in our current knowledge.
Two of the analysed acoustic indices (ADI and AEI) display a unique behaviour in which the directional (i.e. positive or negative) change in value caused by increasing noise perturbance depends on the length of the acoustic density gradient present in each recording. Specifically, the 50 m car drive-by sound clip exhibits a more pronounced amplitude change over a shorter period compared to the 150 m and 250 m sound clips, which have a slower progression of amplitude change. In line with the explanation provided by Eldridge et al. (2018), we believe that the 'flatter envelope' of the 150 m and 250 m sound clips yields a higher entropy calculation, causing increased AEI and decreased ADI as more of these sound clips are incorporated into the soundscape. Conversely, the elevated acoustic activity present in the 50 m sound clip may imply an entropy resembling a quiet signal, thus causing inverted ADI and AEI trends as more cars are introduced.
While our study provides valuable insights into the influence of vehicular noise on acoustic indices commonly used in PAM efforts, it is important to recognise that these bird assemblage soundscapes simplify the complexity of real-world ecosystems and anthropogenic noise sources. The incorporation of vehicular noise in a controlled manner reflects an attempt to capture the acoustic qualities of real-world recordings with varying levels of anthropogenic noise. Nonetheless, the intricacies of natural soundscapes, including the variability in bird species composition, non-avian biophony, and the temporal dynamics of anthropogenic noise, may not be fully captured in our simulations.
Differing methodologies used to develop simulated soundscapes can have important implications for resulting outputs. For example, H displayed a notably wider value range (0.37-0.95) and a positive correlation with species richness in a different simulated environment (Sueur et al. 2008), in contrast to our findings, in which H had a more condensed range and a negative correlation with increased bird vocalisations. In turn, comparing acoustic index outputs in different simulated conditions, such as with different noise pollution sources or taxa of interest, could yield new insights into their behaviours. In another simulated environment, NP and AR were least affected by constant background noise sources (e.g. insect drone, wind) among a range of acoustic indices (Gasc et al. 2015). This contrasts with our findings, where both NP and AR were influenced by the addition of vehicular background noise, even at further distances and lower traffic levels. This could indicate that NP and AR are more robust to certain background noises that are more consistent in magnitude (e.g. insect noise, geophony) and more prone to the effects of irregular background noise sources, such as vehicles. Clear interpretation of acoustic index values requires an understanding of how such indices behave when derived from different sound sources and overall soundscape patterns (Bradfer-Lawrence et al. 2023). Thus, we underscore the necessity of comparing outcomes and methodologies from several simulated soundscapes and observational studies to assess which findings are justifiably applicable to future research initiatives.

Conclusion
Past empirical research has underscored the potential bias that vehicular noise can introduce into acoustic index values derived from PAM efforts, prompting many wildlife ecologists to caution against the use of acoustic indices in human-developed environments (Fuller et al. 2015; Fairbrass et al. 2017; Alcocer et al. 2022). While biases were discernible across all examined acoustic indices, four indices (BI, ACI, ADI, and AEI) displayed heightened resistance to added vehicular noise at moderate and extended distances from roadsides. In turn, we provide evidence for the implementation of these indices as a viable option for ecologists working in environments in which roadways may contribute noise pollution to ARU recordings. While our findings should be interpreted in conjunction with other behavioural observations of acoustic indices in various development contexts, we believe that our results will better inform the use of acoustic indices in future ecological monitoring initiatives.

Figure 2. An illustrative example of one of our bird assemblage soundscapes (top), chosen at random, with four vocalizing bird species and three passing cars per minute at 150 m from the roadside, alongside an AudioMoth recording (bottom) with four vocalizing bird species detected using over-ear headphones, a distance of 131 m from the roadside, and an AADT of 4550 (i.e. 3.16 cars per minute) as reported by the Illinois Department of Transportation (2021) ('Illinois highway system file').

Figure 3. Values for nine acoustic indices as predicted by the number of vocalizing bird species. Predicted values were obtained from n = 16 control soundscapes containing only bird vocalizations and pink noise. Each subplot corresponds to a specific acoustic index. Regression lines and R-squared values were obtained using the 'ggpmisc' package in R (Aphalo 2021). Blue lines depict a linear regression line, while the red line depicts a first-order polynomial regression line. Acoustic indices include Bioacoustic Index (BI), Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).

Figure 4. Predicted values for nine acoustic indices as a function of cars per minute, distance from roadside, and the number of vocalizing bird species. Each subplot corresponds to a specific acoustic index, and the facets within correspond to different traffic levels and distances from the roadside. Responses and standard errors were obtained using the 'effect' function of the 'effects' package in R (Fox and Hong 2009), in which the number of species and the interaction term were used as predictors. Colours indicate the different species richness levels tested. Acoustic indices include Bioacoustic Index (BI), Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).

Figure 5. Values for nine acoustic indices as predicted by cars per minute and distance from roadside. All soundscapes contained a random sample of four vocalizing bird species. Each subplot corresponds to a specific acoustic index; the facets within correspond to different distances from the roadside, while the x-axes show variation in traffic levels. Responses and standard errors were obtained using the 'effect' function of the 'effects' package in R (Fox and Hong 2009), in which the interaction term was used as the predictor. Acoustic indices include Bioacoustic Index (BI), Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).

Figure 6. Added species equivalents caused by the addition of vehicular noise to simulated bird assemblage soundscapes. Species equivalents are the hypothetical number of vocalizing bird species necessary to bias an acoustic index value by the same degree as the added vehicular noise interference. Traffic levels are as follows: low (3 cars per minute at 250 m from roadside), medium (5 cars per minute at 150 m from roadside), and high (7 cars per minute at 50 m from roadside). The middle of each crossbar represents the species equivalent estimate, while the ends of the crossbar correspond to standard error estimates derived from the 'se.fit = TRUE' argument of the 'predict' function in the 'stats' package (R Core Team 2022). Acoustic indices include Acoustic Entropy Index (H), Acoustic Complexity Index (ACI), Acoustic Diversity Index (ADI), Acoustic Evenness Index (AEI), Amplitude Index (M), Acoustic Richness Index (AR), Normalized Difference Soundscape Index (NDSI), and the Number of Frequency Peaks (NP).

Table 1. Summary of acoustic indices and their respective interpretations, assessed with 598 simulated bird assemblage soundscapes. Acoustic indices are listed in chronological order of original publication date.

Table 2. Linear regression coefficients for the interactive effect of cars per minute and distance from roadside for nine acoustic indices. Bold text indicates a p-value less than 0.05. Positive coefficients indicate an increase in the predicted acoustic index for that traffic level and distance from roadside, while negative coefficients indicate a decrease in the index. Values are rounded to two decimal places.