This function uses neuronbridge_search to find potential matches for the data items (i.e. neurons/lines) given in `search`. It also fetches the same information for data items you would rather not have among your 'hits' (`avoid`), e.g. hemibrain connectome neurons that you do not want to also appear in the lines returned when you search for your `search` neurons. Depending on your use case, you may prefer to retrieve a series of match tables with neuronbridge_search and filter them yourself, rather than use this function. See details below.

neuronbridge_avoid(search, avoid, version = "v2_1_1", threshold = 23000)

## Arguments

• search - data item IDs (in neuronbridge_ids) for which you want to find hits.

• avoid - data item IDs (in neuronbridge_ids) that you would rather not have in your hits.

• version - the precomputed scores to search. For example, "v2_1_1" refers to this release.

• threshold - LM-EM matches with a normalizedScore below this value are not returned.
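A sketch of a full call with every argument spelled out. The IDs passed to `avoid` below are placeholders, not real hemibrain body IDs; `version` and `threshold` are shown at their defaults:

```r
library(neuronbridger)

# Illustrative only: "1111111111" and "2222222222" are placeholder IDs,
# not real hemibrain body IDs.
hits <- neuronbridge_avoid(search    = "542634818",
                           avoid     = c("1111111111", "2222222222"),
                           version   = "v2_1_1",  # default precomputed release
                           threshold = 23000)     # default normalizedScore cutoff
```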

## Value

a data.frame of hits. Each row indicates a separate MIP file with its own nb.id. The data.frame is already ranked by normalizedScore, so the top scores (better matches) are at the top of the data frame. The columns mean:

• "publishedName" - the ID for the potential hit neuron/line, i.e. it specifies a genetic driver resource or a connectome neuron. These are the same IDs that can be seen with neuronbridge_ids.

• "libraryName" - the data set from which this data item came.

• "imageURL" - the path on https://s3.amazonaws.com/, at which one can find a 'high res' .png of the MIP file for this data item.

• "thumbnailURL" - the path on https://s3.amazonaws.com/, at which one can find a 'low res' .jpg thumbnail of the MIP file for this data item.

• "slideCode" - the unique identifier for the sample from which the MIP came. The first number indicates the date the image was taken by FlyLight.

• "objective" - the magnification under which the image was taken.

• "gender" - the sex of the fly brain from which this data item derives. f = female, m = male.

• "anatomicalArea" - the gross part of the nervous system imaged, e.g. brain or ventral nervous system.

• "alignmentSpace" - the template brain to which the image that formed this MIP was aligned. Typically, this is the JRC2018 standard template brain from Bogovic et al. 2018.

• "channel" - number of the channel from the aligned image stack that is represented by this MIP.

• "mountingProtocol" - the protocol used to prepare brain sample for imaging.

• "matchingPixels" - the number of overlapping pixels between query (searched.id) and target (nb.id).

• "gradientAreaGap" - unsure, seeking clarification from NeuronBridge

• "normalizedGapScore" - unsure, seeking clarification from NeuronBridge

• "normalizedScore" - the matching score, created by examining the number of overlapping pixels and their colour depth. If the colour and xy position of a pixel match between the mask and the searched data, the approach counts it towards a positive matching score.

• "searched.id" - the nb.id you searched with, i.e. the one given to the function call.

• "nb.id" - the 'NeuronBridge ID' for the MIP file.
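Because the return value is an ordinary ranked data.frame, it can be filtered with base R. A minimal sketch, assuming `hits` was returned by neuronbridge_avoid() and using an illustrative score cutoff:

```r
# Keep only strong matches and the columns useful for a first inspection.
# The cutoff of 30000 is illustrative, not a recommended value.
good <- hits[hits$normalizedScore > 30000,
             c("publishedName", "libraryName", "gender",
               "normalizedScore", "matchingPixels")]
head(good)
```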

## See also

neuronbridge_info, neuronbridge_mip, neuronbridge_search

## Examples

# \donttest{
if (FALSE) {

# Get a helpful package with some classed hemibrain neuron IDs
if (!require("hemibrainr")) remotes::install_github("flyconnectome/hemibrainr")

library(hemibrainr)

# So this is the 'olfactory PN' neuron we want
search = "542634818"

# And we do not want these other PNs
avoid = setdiff(hemibrainr::pn.ids,search)

# Let's see what we get for it
hits = neuronbridge_avoid(search = search, avoid = avoid)
# View(hits[1:20,])

}
# }