{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "Vq31CdSRpgkI"
},
"source": [
"# Customizing embeddings\n",
"\n",
"This notebook demonstrates one way to customize OpenAI embeddings to a particular task.\n",
"\n",
"The input is training data in the form of [text_1, text_2, label] where label is +1 if the pairs are similar and -1 if the pairs are dissimilar.\n",
"\n",
"The output is a matrix that you can use to multiply your embeddings. The product of this multiplication is a 'custom embedding' that will better emphasize aspects of the text relevant to your use case. In binary classification use cases, we've seen error rates drop by as much as 50%.\n",
"\n",
"In the following example, I use 1,000 sentence pairs picked from the SNLI corpus. Each pair of sentences are logically entailed (i.e., one implies the other). These pairs are our positives (label = 1). We generate synthetic negatives by combining sentences from different pairs, which are presumed to not be logically entailed (label = -1).\n",
"\n",
"For a clustering use case, you can generate positives by creating pairs from texts in the same clusters and generate negatives by creating pairs from sentences in different clusters.\n",
"\n",
"With other data sets, we have seen decent improvement with as little as ~100 training examples. Of course, performance will be better with more examples."
]
},
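{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make that output concrete: you apply the learned matrix by multiplying a raw embedding by it, then compute cosine similarity on the customized vectors. The snippet below is an illustrative sketch only, not part of this notebook's pipeline; `matrix`, `embedding_1`, and `embedding_2` are placeholder names.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"\n",
"def custom_cosine_similarity(embedding_1, embedding_2, matrix):\n",
"    # project both raw embeddings through the learned matrix\n",
"    custom_1 = np.array(embedding_1) @ matrix\n",
"    custom_2 = np.array(embedding_2) @ matrix\n",
"    # cosine similarity of the customized embeddings\n",
"    return np.dot(custom_1, custom_2) / (np.linalg.norm(custom_1) * np.linalg.norm(custom_2))\n",
"```"
]
},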
{
"cell_type": "markdown",
"metadata": {
"id": "arB38jFwpgkK"
},
"source": [
"# 0. Imports"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "ifvM7g4apgkK"
},
"outputs": [],
"source": [
"# imports\n",
"from typing import List, Tuple # for type hints\n",
"\n",
"import numpy as np # for manipulating arrays\n",
"import pandas as pd # for manipulating data in dataframes\n",
"import pickle # for saving the embeddings cache\n",
"import plotly.express as px # for plots\n",
"import random # for generating run IDs\n",
"from sklearn.model_selection import train_test_split # for splitting train & test data\n",
"import torch # for matrix optimization\n",
"\n",
"from openai.embeddings_utils import get_embedding, cosine_similarity # for embeddings\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "DtBbryAapgkL"
},
"source": [
"## 1. Inputs\n",
"\n",
"Most inputs are here. The key things to change are where to load your datset from, where to save a cache of embeddings to, and which embedding engine you want to use.\n",
"\n",
"Depending on how your data is formatted, you'll want to rewrite the process_input_data function."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "UzxcWRCkpgkM"
},
"outputs": [],
"source": [
"# input parameters\n",
"embedding_cache_path = \"data/snli_embedding_cache.pkl\" # embeddings will be saved/loaded here\n",
"default_embedding_engine = \"babbage-similarity\" # choice of: ada, babbage, curie, davinci\n",
"num_pairs_to_embed = 1000 # 1000 is arbitrary - I've gotten it to work with as little as ~100\n",
"local_dataset_path = \"data/snli_1.0_train_2k.csv\" # download from: https://nlp.stanford.edu/projects/snli/\n",
"\n",
"\n",
"def process_input_data(df: pd.DataFrame) -> pd.DataFrame:\n",
" # you can customize this to preprocess your own dataset\n",
" # output should be a dataframe with 3 columns: text_1, text_2, label (1 for similar, -1 for dissimilar)\n",
" df[\"label\"] = df[\"gold_label\"]\n",
" df = df[df[\"label\"].isin([\"entailment\"])]\n",
" df[\"label\"] = df[\"label\"].apply(lambda x: {\"entailment\": 1, \"contradiction\": -1}[x])\n",
" df = df.rename(columns={\"sentence1\": \"text_1\", \"sentence2\": \"text_2\"})\n",
" df = df[[\"text_1\", \"text_2\", \"label\"]]\n",
" df = df.head(num_pairs_to_embed)\n",
" return df\n"
]
},
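{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a sketch of rewriting `process_input_data` for a differently formatted dataset: suppose your CSV had the (hypothetical) columns `question`, `duplicate_question`, and `is_duplicate` (1 for duplicates, 0 otherwise). The function might then look like this:\n",
"\n",
"```python\n",
"def process_input_data(df: pd.DataFrame) -> pd.DataFrame:\n",
"    # hypothetical schema: question, duplicate_question, is_duplicate (1 = duplicate, 0 = not)\n",
"    df = df.rename(columns={\"question\": \"text_1\", \"duplicate_question\": \"text_2\"})\n",
"    df[\"label\"] = df[\"is_duplicate\"].apply(lambda x: 1 if x == 1 else -1)\n",
"    return df[[\"text_1\", \"text_2\", \"label\"]].head(num_pairs_to_embed)\n",
"```"
]
},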
{
"cell_type": "markdown",
"metadata": {
"id": "aBbH71hEpgkM"
},
"source": [
"## 2. Load and process input data"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"id": "kAKLjYG6pgkN",
"outputId": "dc178688-e97d-4ad0-b26c-dff67b858966"
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/var/folders/r4/x3kdvs816995fnnph2gdpwp40000gn/T/ipykernel_17383/802273372.py:13: SettingWithCopyWarning: \n",
"A value is trying to be set on a copy of a slice from a DataFrame.\n",
"Try using .loc[row_indexer,col_indexer] = value instead\n",
"\n",
"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n",
" df[\"label\"] = df[\"label\"].apply(lambda x: {\"entailment\": 1, \"contradiction\": -1}[x])\n"
]
},
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>text_1</th>\n",
" <th>text_2</th>\n",
" <th>label</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>A person on a horse jumps over a broken down a...</td>\n",
" <td>A person is outdoors, on a horse.</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>Children smiling and waving at camera</td>\n",
" <td>There are children present</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>7</th>\n",
" <td>A boy is jumping on skateboard in the middle o...</td>\n",
" <td>The boy does a skateboarding trick.</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>14</th>\n",
" <td>Two blond women are hugging one another.</td>\n",
" <td>There are women showing affection.</td>\n",
" <td>1</td>\n",
" </tr>\n",
" <tr>\n",
" <th>17</th>\n",
" <td>A few people in a restaurant setting, one of t...</td>\n",
" <td>The diners are at a restaurant.</td>\n",
" <td>1</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" text_1 \\\n",
"2 A person on a horse jumps over a broken down a... \n",
"4 Children smiling and waving at camera \n",
"7 A boy is jumping on skateboard in the middle o... \n",
"14 Two blond women are hugging one another. \n",
"17 A few people in a restaurant setting, one of t... \n",
"\n",
" text_2 label \n",
"2 A person is outdoors, on a horse. 1 \n",
"4 There are children present 1 \n",
"7 The boy does a skateboarding trick. 1 \n",
"14 There are women showing affection. 1 \n",
"17 The diners are at a restaurant. 1 "
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# load data\n",
"df = pd.read_csv(local_dataset_path)\n",
"\n",
"# process input data\n",
"df = process_input_data(df) # this demonstrates training data containing only positives\n",
"\n",
"# view data\n",
"df.head()\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "z2F1cCoYpgkO"
},
"source": [
"## 3. Split data into training test sets\n",
"\n",
"Note that it's important to split data into training and test sets *before* generating synethetic negatives or positives. You don't want any text strings in the training data to show up in the test data. If there's contamination, the test metrics will look better than they'll actually be in production."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "50QmnH2qpgkO",
"outputId": "6144029b-eb29-439e-9990-7aeb28168e56"
},
"outputs": [],
"source": [
"# split data into train and test sets\n",
"test_fraction = 0.5 # 0.5 is fairly arbitrary\n",
"random_seed = 123 # random seed is arbitrary, but is helpful in reproducibility\n",
"train_df, test_df = train_test_split(\n",
" df, test_size=test_fraction, stratify=df[\"label\"], random_state=random_seed\n",
")\n",
"train_df.loc[:, \"dataset\"] = \"train\"\n",
"test_df.loc[:, \"dataset\"] = \"test\"\n"
]
},
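{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check on the contamination point above, you can count how many text strings appear in both splits (a sketch, not part of the original notebook; some overlap is possible here because SNLI reuses premise sentences across pairs):\n",
"\n",
"```python\n",
"train_texts = set(train_df[\"text_1\"]) | set(train_df[\"text_2\"])\n",
"test_texts = set(test_df[\"text_1\"]) | set(test_df[\"text_2\"])\n",
"print(f\"{len(train_texts & test_texts)} texts appear in both splits\")\n",
"```"
]
},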
{
"cell_type": "markdown",
"metadata": {
"id": "MzAFkA2opgkP"
},
"source": [
"## 4. Generate synthetic negatives\n",
"\n",
"This is another piece of the code that you will need to modify to match your use case.\n",
"\n",
"If you have data with positives and negatives, you can skip this section.\n",
"\n",
"If you have data with only positives, you can mostly keep it as is, where it generates negatives only.\n",
"\n",
"If you have multiclass data, you will want to generate both positives and negatives. The positives can be pairs of text that share labels, and the negatives can be pairs of text that do not share labels.\n",
"\n",
"The final output should be a dataframe with text pairs, where each pair is labeled -1 or 1."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"id": "rUYd9V0zpgkP"
},
"outputs": [],
"source": [
"# generate negatives\n",
"def dataframe_of_negatives(dataframe_of_positives: pd.DataFrame) -> pd.DataFrame:\n",
" \"\"\"Return dataframe of negative pairs made by combining elements of positive pairs.\"\"\"\n",
" texts = set(dataframe_of_positives[\"text_1\"].values) | set(\n",
" dataframe_of_positives[\"text_2\"].values\n",
" )\n",
" all_pairs = {(t1, t2) for t1 in texts for t2 in texts if t1 < t2}\n",
" positive_pairs = set(\n",
" tuple(text_pair)\n",
" for text_pair in dataframe_of_positives[[\"text_1\", \"text_2\"]].values\n",
" )\n",
" negative_pairs = all_pairs - positive_pairs\n",
" df_of_negatives = pd.DataFrame(list(negative_pairs), columns=[\"text_1\", \"text_2\"])\n",
" df_of_negatives[\"label\"] = -1\n",
" return df_of_negatives\n"
]
},
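{
"cell_type": "markdown",
"metadata": {},
"source": [
"For the multiclass case mentioned above, one sketch (assuming a hypothetical dataframe with `text` and `class` columns) is to label every pair of texts by whether their classes match:\n",
"\n",
"```python\n",
"import itertools\n",
"\n",
"\n",
"def pairs_from_labeled_texts(labeled_df: pd.DataFrame) -> pd.DataFrame:\n",
"    \"\"\"Return all text pairs, labeled 1 if the two texts share a class and -1 otherwise.\"\"\"\n",
"    rows = [\n",
"        {\"text_1\": t1, \"text_2\": t2, \"label\": 1 if c1 == c2 else -1}\n",
"        for (t1, c1), (t2, c2) in itertools.combinations(\n",
"            zip(labeled_df[\"text\"], labeled_df[\"class\"]), 2\n",
"        )\n",
"    ]\n",
"    return pd.DataFrame(rows)\n",
"```"
]
},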
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"id": "Rkh8-J89pgkP"
},
"outputs": [],
"source": [
"negatives_per_positive = (\n",
" 1 # it will work at higher values too, but more data will be slower\n",
")\n",
"# generate negatives for training dataset\n",
"train_df_negatives = dataframe_of_negatives(train_df)\n",
"train_df_negatives[\"dataset\"] = \"train\"\n",
"# generate negatives for test dataset\n",
"test_df_negatives = dataframe_of_negatives(test_df)\n",
"test_df_negatives[\"dataset\"] = \"test\"\n",
"# sample negatives and combine with positives\n",
"train_df = pd.concat(\n",
" [\n",
" train_df,\n",
" train_df_negatives.sample(\n",
" n=len(train_df) * negatives_per_positive, random_state=random_seed\n",
" ),\n",
" ]\n",
")\n",
"test_df = pd.concat(\n",
" [\n",
" test_df,\n",
" test_df_negatives.sample(\n",
" n=len(test_df) * negatives_per_positive, random_state=random_seed\n",
" ),\n",
" ]\n",
")\n",
"\n",
"df = pd.concat([train_df, test_df])\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "8MVSLMSrpgkQ"
},
"source": [
"## 5. Calculate embeddings and cosine similarities\n",
"\n",
"Here, I create a cache to save the embeddings. This is handy so that you don't have to pay again if you want to run the code again."
]
},
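{
"cell_type": "markdown",
"metadata": {},
"source": [
"The general pattern for such a cache is sketched below (the next cell implements something along these lines): a dict keyed by `(text, engine)`, consulted before calling `get_embedding`, and pickled to disk whenever a new embedding is added.\n",
"\n",
"```python\n",
"def get_embedding_with_cache(text, engine, cache, cache_path):\n",
"    # sketch only; see the next cell for this notebook's actual implementation\n",
"    if (text, engine) not in cache:\n",
"        print(f\"Getting embedding for {text}\")\n",
"        cache[(text, engine)] = get_embedding(text, engine=engine)\n",
"        with open(cache_path, \"wb\") as f:\n",
"            pickle.dump(cache, f)\n",
"    return cache[(text, engine)]\n",
"```"
]
},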
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"id": "R6tWgS_ApgkQ"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Getting embedding for Three wheelchair basketball players wearing team uniforms are attempting to reach the descending basketball with other players in the background.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A person in a red coat and a running black and brown dog.\n",
"Getting embedding for Five men, one wearing a white shirt standing on something, hanging up a picture of a child.\n",
"Getting embedding for A busy city that looks like New York City.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for Emergency personnel looking into the back of a car.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for High fashion ladies wait outside a tram beside a crowd of people in the city.\n",
"Getting embedding for A young man and young lady dancing on a carpeted floor with a picture from the movie Toy Story on a big screen in the background.\n",
"Getting embedding for Two tan and white dogs and one tan dog racing down the beach near the water.\n",
"Getting embedding for Two women, each embracing a little girl, catch up at a small family gathering.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A man wearing a white shire shirt and hate is riding a bucking horse in a rodeo.\n",
"Getting embedding for A man and a woman are holding hands as they walk along a city sidewalk.\n",
"Getting embedding for A young woman packs belongings into a black suitcase.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for A main street scene of a small town with an overhead welcome sign that says \"Welcome to Golden\".\n",
"Getting embedding for Little girl walking along a dirt, rock, and grass path wearing a blue hat many steps behind two people wearing backpacks and holding umbrellas.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A couple playing with a little boy on the beach.\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for A man in a gold skirt sits in front of the computer.\n",
"Getting embedding for Soccer teams play on a field as the sun sets behind a line of trees.\n",
"Getting embedding for two little girls, one in a green jacket and one in a pink jacket, and a little boy in a green jacket holding an apple sitting on a rock.\n",
"Getting embedding for A man in an elf hat holding a white umbrella is standing on the sidewalk with two other men.\n",
"Getting embedding for Mothers with children talking at a gathering.\n",
"Getting embedding for Workers are sitting next to a work zone eating food.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for A small group of church-goers watch a choir practice.\n",
"Getting embedding for A family between a van and fence\n",
"Getting embedding for Two men with heads down signing a paper.\n",
"Getting embedding for A woman wearing orange looking upward.\n",
"Getting embedding for A woman with dark hair is wearing a green sweater.\n",
"Getting embedding for A brown dog running with two white and brown dogs on the seashore with crashing waves behind them.\n",
"Getting embedding for A soccer match between a team with white jerseys, and a team with yellow jerseys.\n",
"Getting embedding for A mountain biker jumping a slope outdoors in a forest area.\n",
"Getting embedding for The Arsenal football club warms-up on the soccer field as a few fans watch.\n",
"Getting embedding for A man in blue shorts and without a shirt is jogging down the road while listening to his iPod.\n",
"Getting embedding for Three girls on an amusement ride enjoying themselves.\n",
"Getting embedding for A woman wearing a green headscarf laughs while a woman in the background paddles a boat.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for Group of people dancing\n",
"Getting embedding for A man wearing a blue shirt is sitting on a brick planter next to some paintings.\n",
"Getting embedding for People listening to a choir in a Catholic church.\n",
"Getting embedding for A driver is racing his Ford vehicle on a gravel track.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for A skier slides along a metal rail.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for Closeup image of a dog swimming.\n",
"Getting embedding for An english farmer with a horse pulled wagon.\n",
"Getting embedding for A man in a white shirt hangs a painting in a run down store while other men watch.\n",
"Getting embedding for two female medical personnel read their charts.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for The dogs run and play with a red ball.\n",
"Getting embedding for Two men in wheelchairs are reaching in the air for a basketball.\n",
"Getting embedding for Men fish on a concrete slab.\n",
"Getting embedding for Three guys and a girl are all jumping in a pool together.\n",
"Getting embedding for Busy ChinaTown street corner where people are walking past an open front store.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for The skier is wearing a yellow jumpsuit and sliding across a yellow rail.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for man sitting down playing a game of chess alone\n",
"Getting embedding for Four teenage boys are doing cannonballs into a swimming pool.\n",
"Getting embedding for Little girl in a blue and yellow plaid outfit and blue hat is running along the trail.\n",
"Getting embedding for Men in uniform work together.\n",
"Getting embedding for A blond child is pulling a wagon with a little blond boy in it.\n",
"Getting embedding for Busy Japanese intersection like maybe Tokyo.\n",
"Getting embedding for A model posing to look as if she's a real female soccer player.\n",
"Getting embedding for 1 little boy wearing a pirate costume following closely behind a little girl wearing a blue dress carrying a orange pumpkin bucket and walking down the sidewalk.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A bunch of people playing soccer.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for Five men, one wearing a white shirt standing on something, hanging up a picture of a child.\n",
"Getting embedding for People sit and relax next to a pool in a plaza.\n",
"Getting embedding for A young boy paddles across the water in a makeshift boat.\n",
"Getting embedding for A woman is waiting with children as she is checked out at Walmart.\n",
"Getting embedding for There is a woman holding a baby, along with a man with a save the children bag.\n",
"Getting embedding for A soccer game occurring at sunset.\n",
"Getting embedding for A groom and a bride are standing on the grass with his hand on her waist.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for Small blond-haired girl drinking a glass of juice.\n",
"Getting embedding for Five men, one wearing a white shirt standing on something, hanging up a picture of a child.\n",
"Getting embedding for A woman stands behind an outdoor grill with a blue basket of food in her hands.\n",
"Getting embedding for People in a truck-full of sacks in a field full of sheep.\n",
"Getting embedding for An expectant woman happily lets another listen to the baby inside of her.\n",
"Getting embedding for A man wearing blue jeans and red bowling shoes stands in a bowling alley lane with a green ball in his hand.\n",
"Getting embedding for Two women, each with a child, look at each other.\n",
"Getting embedding for A young girl dancing in her socks on a wooden floor strewn with pink balloons.\n",
"Getting embedding for A climber is making his way up a snowy mountainside.\n",
"Getting embedding for Five men, one wearing a white shirt standing on something, hanging up a picture of a child.\n",
"Getting embedding for A woman in costume is marching with a large drum.\n",
"Getting embedding for Two men cook together with a metal bowl, near a hanging plant.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A fuzzy white lap dog runs along a rocky beach.\n",
"Getting embedding for Three men are smiling and posing behind a truck loaded with various construction supplies.\n",
"Getting embedding for A dog drops a red disc on a beach.\n",
"Getting embedding for A girl wearing a dress is blowing bubbles at a dock.\n",
"Getting embedding for A little boy underwater in a pool, holding a plastic dinosaur.\n",
"Getting embedding for A group of people sitting around a picnic table.\n",
"Getting embedding for A person on a horse jumps over a broken down airplane.\n",
"Getting embedding for A man and two women in black jackets holding umbrellas sit on a long wooden bench.\n",
"Getting embedding for Two men stand around a mixing bowl.\n",
"Getting embedding for the three boys are all holding onto a flotation device in the water.\n",
"Getting embedding for Children going home from school.\n",
"Getting embedding for Several people are dancing together in sync.\n",
"Getting embedding for A group of people sitting at a table outside talking.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for A sumo wrestler with a brown belt is pushing another wrestler in a bout.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for Two people wearing blue clothing are making hand gestures next to one another.\n",
"Getting embedding for A white bike is leaning against a post.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for An older man is drinking orange juice at a restaurant.\n",
"Getting embedding for Skydivers in formation.\n",
"Getting embedding for A black dog swimming in water near rocks.\n",
"Getting embedding for A view of buildings and people walking across the streets in Times Square, New York City.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for a woman wearing a Chinese straw hat operating some sort of stainless steel machine in what appears to be a park.\n",
"Getting embedding for Two men share a laugh while in the kitchen.\n",
"Getting embedding for A little kid enjoying some sledding on a winter day.\n",
"Getting embedding for A child with a brightly colored shirt plays outside.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for A woman and a child holding on to the railing while on trolley.\n",
"Getting embedding for A gentleman in a purple scarf and hat is looking at money while holding an accordion.\n",
"Getting embedding for Toddler in striped sweatshirt plays on rope on playground.\n",
"Getting embedding for A boy in a red and blue shirt painting a log.\n",
"Getting embedding for An elderly couple dance in front of a juke box while a guy in shorts sleeps at a nearby table\n",
"Getting embedding for Two women and one man sit on a bench.\n",
"Getting embedding for A man in a red and black jacket, blue shirt, lots of silver necklaces, and his blue jeans falling down, checks out a woman wearing a black leather jacket, yellow bra, pink fingerless gloves, and sunglasses.\n",
"Getting embedding for A young man in a red quilted vest displays an assortment of silver pendants around his neck as he watches a woman in a yellow bikini top, a black jacket, and bright pink fingerless gloves go by.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for A woman in a green jacket and hood over her head looking towards a valley.\n",
"Getting embedding for A baseball player is about to throw a baseball.\n",
"Getting embedding for Two elderly men having a conversation, snow covered grass in the background.\n",
"Getting embedding for A man is using his computer while seated at a desk.\n",
"Getting embedding for Firemen emerge from a subway station.\n",
"Getting embedding for A female violinist surrounded by other violinists.\n",
"Getting embedding for Soccer players on a field from a distance.\n",
"Getting embedding for A man sitting on a scooter on the curb.\n",
"Getting embedding for Motorcyclist performing while two men watch.\n",
"Getting embedding for Indian couple holding child near riverbank.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for A woman sitting in a laundromat looking at the camera.\n",
"Getting embedding for Two men are on scaffolding as they paint above a storefront while a man on the sidewalk stands next to them talking on the phone.\n",
"Getting embedding for A man is sitting with his head facing down, while other people are looking in his direction.\n",
"Getting embedding for A little boy swimming underwater with a toy in his hand.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for A man wearing black is playing an electric guitar at a concert.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for A young man doing a trick on a skateboard down the stairs while being photographed.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for A woman in a green jacket and black sunglasses outside in a crowd.\n",
"Getting embedding for man in red canada shirt standing with three men in army uniform\n",
"Getting embedding for Two women are walking down a dirt path carrying loads on their heads.\n",
"Getting embedding for Young woman in a cafe checking her cellphone.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for A mother with her four children.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for Older couple posing for a picture in front of a fountain.\n",
"Getting embedding for The boy locked the cycle and went away.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for A boat worker securing line.\n",
"Getting embedding for A man wearing a red sweater is sitting on a car bumper watching another person work.\n",
"Getting embedding for A man walking along side a clean up crew.\n",
"Getting embedding for A blond woman with her hair up is taking off a white sweatshirt.\n",
"Getting embedding for A man with a bright green shirt is talking to a woman in a pink shirt.\n",
"Getting embedding for A large group, wearing pink shirts, waves to onlookers.\n",
"Getting embedding for A woman in a white dress with a tiara sings in a chorus, which has a row of men in sailor hats.\n",
"Getting embedding for A little boy in brown pants is playing on ropes at a park.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for A person on skis on a rail at night.\n",
"Getting embedding for A man with a gray shirt holds a young infant in his hands.\n",
"Getting embedding for A young man in his mid twenties is kicking his left foot about two feet off the leaf covered ground, with paved asphalt and green plants and trees in the background.\n",
"Getting embedding for A man is singing into a microphone.\n",
"Getting embedding for A man in ruffles pushes a stroller through a park.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A man sitting in a barber shop.\n",
"Getting embedding for The surfer catches a big wave but stays on his board.\n",
"Getting embedding for A little boy swims underwater.\n",
"Getting embedding for Oddly dressed man walking down the street.\n",
"Getting embedding for Cars are passing through a town.\n",
"Getting embedding for Child with pink strings on head dancing surrounded by confetti, balloons.\n",
"Getting embedding for A gray-haired woman in a blue dress coat with snowflakes on it balances something on her head.\n",
"Getting embedding for A man walking proudly down the street.\n",
"Getting embedding for An average looking man is playing the guitar.\n",
"Getting embedding for A young woman tries to stick her foot in a fountain.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A spotted black and white dog splashes in the water.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for A cowboy Roping a calf in a rodeo.\n",
"Getting embedding for A little boy drinks milk and gets milk all over his face and table.\n",
"Getting embedding for Woman with green sweater and sunglasses smiling\n",
"Getting embedding for A dog jumps to catch a toy in the snow.\n",
"Getting embedding for A woman on the side of a street is making food on her cart.\n",
"Getting embedding for A young woman tries to stick her foot in a fountain.\n",
"Getting embedding for A smiling man cooks something delicious.\n",
"Getting embedding for A woman in colorful native attire featuring a blue shirt with a colorful design displays her dark hair braided with red ribbons.\n",
"Getting embedding for Two older men are talking.\n",
"Getting embedding for Three dogs in different shades of brown and white biting and licking each other.\n",
"Getting embedding for A man and a woman being intimate, with their legs in the water.\n",
"Getting embedding for A dog is chasing a ball in a backyard.\n",
"Getting embedding for A male painting a scene in front of him.\n",
"Getting embedding for Two young girls are playing outside in a non-urban environment.\n",
"Getting embedding for a couple are holding hands behind their backs while walking down a street, and the man has his arm around her shoulder.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for A young woman is playing the violin.\n",
"Getting embedding for A man in costume is ringing a bell.\n",
"Getting embedding for Tourists waiting at a train stop.\n",
"Getting embedding for two small girls walk along the leaves.\n",
"Getting embedding for Two pre-teen girls listening to music on an MP3 player with headphones.\n",
"Getting embedding for The furry brown dog is swimming in the ocean.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for Little Girl in brown shirt and blue jean skirt dances on wood floor.\n",
"Getting embedding for A man holding a green bowling ball stands by the ball return machine in a bowling alley.\n",
"Getting embedding for A Little League team tries to catch a runner sliding into a base in an afternoon game.\n",
"Getting embedding for A row of legs and black boots with a boy sitting at the end of the row.\n",
"Getting embedding for Two people loading brush on a trailer attached to a truck.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for Three men standing on grass by the water looking at something on a table.\n",
"Getting embedding for An oddly dressed man pushing a stroller down a sidewalk in a park.\n",
"Getting embedding for Various people hanging around outside of a building.\n",
"Getting embedding for A man is painting a portrait of an outside scene that includes a street sign with a bicycle chained to it.\n",
"Getting embedding for a skateboarder skates in the pool.\n",
"Getting embedding for A big brown dog swims towards the camera.\n",
"Getting embedding for A shot-on-goal action photo of soccer players in red and black uniforms.\n",
"Getting embedding for Three people are sitting on a bench.\n",
"Getting embedding for Two young men drink beer, leaning on a graffitied wall.\n",
"Getting embedding for A man in a tan suit is using a pay phone to make a call.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for A blond woman with two children is checking out at a Walmart register.\n",
"Getting embedding for Women exercising one woman has a green mat and black outfit on.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for A man in a striped polo shirt is pointing and smiling.\n",
"Getting embedding for A mountain biker jumping a slope outdoors in a forest area.\n",
"Getting embedding for A small white dog running on a pebble covered beach.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A young Asian man sits behind a set chessboard waiting for the other player to arrive.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for Two men are standing outside and snow is on the ground.\n",
"Getting embedding for Two people standing in front of a large statue of a woman, other statues and busts visible in the background.\n",
"Getting embedding for One biker is running with their bike while another is riding around them.\n",
"Getting embedding for A woman in a blue shirt and green hat looks up at the camera.\n",
"Getting embedding for a young man wearing a backpack and sunglasses is walking towards a shopping area.\n",
"Getting embedding for A man and woman watching two kids while the man holds a balloon.\n",
"Getting embedding for A child stoops to pick up a watermelon from a large pile of them.\n",
"Getting embedding for A woman is sitting at an outdoor dining table.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for A man, wearing revolutionary period clothes, is ringing a bell.\n",
"Getting embedding for A man carrying a load of fresh direct boxes on car with wheels in the city streets, as a woman walks towards him.\n",
"Getting embedding for A man parasails in the choppy water.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for A foreign family is walking along a dirt path next to the water.\n",
"Getting embedding for Two girls, each in a dress walking together.\n",
"Getting embedding for A black-and-white dog carries a stick in his mouth as he swims in the clear water.\n",
"Getting embedding for Stacks of neatly folded clothing cover most of this floor while a woman with a beige shirt and jeans busily fills a suitcase.\n",
"Getting embedding for Woman at Walmart check-out having her groceries bagged by an employee.\n",
"Getting embedding for A doctor checks the stomach of a toddler.\n",
"Getting embedding for A couple, wearing black, burgundy, and white, dance.\n",
"Getting embedding for The young man is waiting with others on the sidewalk.\n",
"Getting embedding for A street vendor in Asia tries to bring in more customers.\n",
"Getting embedding for A man being airlifted to safety after being in danger.\n",
"Getting embedding for A man and a woman are walking on a street at the top of a hill.\n",
"Getting embedding for Wet brown dog swims towards camera.\n",
"Getting embedding for Five men, one wearing a white shirt standing on something, hanging up a picture of a child.\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for Two Asian people sit at a blue table in a food court.\n",
"Getting embedding for A white dog runs along a rocky shoreline.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A young man is performing a jump on a skateboard while another young man photographs his stunt.\n",
"Getting embedding for A couple pose in front of a fountain.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for There are cars driving down a street with a sign displaying Welcome to Golden.\n",
"Getting embedding for BMX biker jumps over a ravine.\n",
"Getting embedding for A couple, who appear to be Indian or Pakistani, walk on a path beside a body of water, the mother carrying a child in a diaper, the father wrapped in a blanket with the logo of the humanitarian organization \"Save the Children.\"\n",
"Getting embedding for A track event held by J.P. Morgan Chase with security.\n",
"Getting embedding for A man in a gold foils skirt, sitting at a computer desk, looks at the camera with his hands raised to his face.\n",
"Getting embedding for Two little girls lie on the carpet next to an O made of wooden blocks.\n",
"Getting embedding for A young boy wearing a light blue jacket walks across the brick patio.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for A dog in the water carries a stick in his mouth.\n",
"Getting embedding for A barber waiting for customers.\n",
"Getting embedding for a girl wearing a blue and pink swimsuit is throwing stones into a lake.\n",
"Getting embedding for A woman is running a marathon in a park.\n",
"Getting embedding for A child with a yellow cup and milk all over his face.\n",
"Getting embedding for An excited, smiling woman stands at a red railing as she holds a boombox to one side.\n",
"Getting embedding for There are four Eastern guys working on and hanging pictures of humans.\n",
"Getting embedding for Two men are cooking in the kitchen using rice milk.\n",
"Getting embedding for A meeting of young people sitting at a conference table.\n",
"Getting embedding for Indian lady and a guy in a blue suit dancing in the sunlight.\n",
"Getting embedding for A dog is preparing to run away from a person interacting with it.\n",
"Getting embedding for Two women holding children talking to one another.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for A soccer game played by a blue team and a red team on a perfectly manicured field at dusk.\n",
"Getting embedding for Two men are painting a building while a third is walking past on his cellphone.\n",
"Getting embedding for Young woman running as two guys in the back try to catch up to her.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for A baseball player is putting all his might in to throwing a ball.\n",
"Getting embedding for Group of young women in dresses strolling on the sidewalk.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for a young girl in a flowery dress surrounded by watermelons\n",
"Getting embedding for People going for a long walk to the mountains.\n",
"Getting embedding for A group of young girls playing jump rope in the street.\n",
"Getting embedding for A woman is making a clay pot.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A mountain biker jumping a slope outdoors in a forest area.\n",
"Getting embedding for An elderly couple dances next to a table where a boy is sitting with his head down.\n",
"Getting embedding for People in a meeting setting paying attention to a speaker in an orange shirt.\n",
"Getting embedding for A lady is kneeling wearing a blue shirt.\n",
"Getting embedding for A couple holding hands walks down a street.\n",
"Getting embedding for Three small puppies bite and play together in the grass.\n",
"Getting embedding for Four people are acting behind a woman in a yellow shirt is lying on the ground.\n",
"Getting embedding for a woman in a red jacket watches as a black and brown dog runs away from her in woodland clearing.\n",
"Getting embedding for A man dances with a fire baton at night.\n",
"Getting embedding for A man stare at a passing couple while walking down the block.\n",
"Getting embedding for Two older men in winter coats talking outside of a shop with a grassy lawn covered in a light coat of snow in front of it.\n",
"Getting embedding for A man and woman are walking down the street holding hands.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for Hispanic woman wearing a red plaid shirt works on sewing an article of clothing.\n",
"Getting embedding for A man and a woman are holding hands.\n",
"Getting embedding for A man holding a green bowling ball stands by the ball return machine in a bowling alley.\n",
"Getting embedding for A man parasails in the choppy water.\n",
"Getting embedding for The person is surfing.\n",
"Getting embedding for A man is wearing many silver necklaces.\n",
"Getting embedding for A brown dog running with two white and brown dogs on the seashore with crashing waves behind them.\n",
"Getting embedding for A person rides a bike outdoors.\n",
"Getting embedding for 1 little boy wearing a pirate costume following closely behind a little girl wearing a blue dress carrying a orange pumpkin bucket and walking down the sidewalk.\n",
"Getting embedding for The young violinist is a woman.\n",
"Getting embedding for A young woman packs belongings into a black suitcase.\n",
"Getting embedding for People are on their bikes.\n",
"Getting embedding for A man carrying a load of fresh direct boxes on car with wheels in the city streets, as a woman walks towards him.\n",
"Getting embedding for A man is using his computer while seated at a desk.\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for A young man in his mid twenties is kicking his left foot about two feet off the leaf covered ground, with paved asphalt and green plants and trees in the background.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for A young lady is looking at her camera.\n",
"Getting embedding for A woman is outside.\n",
"Getting embedding for A man holding a green bowling ball stands by the ball return machine in a bowling alley.\n",
"Getting embedding for 2 women are carrying little girls.\n",
"Getting embedding for A person is performing.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for Cars are passing through a town.\n",
"Getting embedding for A woman has children with her at the check out counter.\n",
"Getting embedding for The woman is wearing green.\n",
"Getting embedding for Men watching motorcyclist.\n",
"Getting embedding for A man stares at a passing couple.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for People have bikes\n",
"Getting embedding for Firemen walking outside\n",
"Getting embedding for A man is sitting with his head facing down, while other people are looking in his direction.\n",
"Getting embedding for A young man in his mid twenties is kicking his left foot about two feet off the leaf covered ground, with paved asphalt and green plants and trees in the background.\n",
"Getting embedding for A dog swims in a body of water.\n",
"Getting embedding for The woman is outdoors.\n",
"Getting embedding for A sumo wrestler with a brown belt is pushing another wrestler in a bout.\n",
"Getting embedding for A man and a woman are walking on a street at the top of a hill.\n",
"Getting embedding for A woman is inside.\n",
"Getting embedding for A groom and bride are standing outside.\n",
"Getting embedding for A couple of people are holding hands while walking.\n",
"Getting embedding for A couple of people are loading brush onto a trailer that is attached to a truck.\n",
"Getting embedding for Three kids are sitting on a rock.\n",
"Getting embedding for A meeting of young people sitting at a conference table.\n",
"Getting embedding for A doctor checks the stomach of a toddler.\n",
"Getting embedding for Bikers stop and wait for traffic at the intersection.\n",
"Getting embedding for A man sitting in a barber shop.\n",
"Getting embedding for A young woman packs belongings into a black luggage carrier.\n",
"Getting embedding for A barber waiting for customers.\n",
"Getting embedding for People are looking at sculptures at a museum.\n",
"Getting embedding for A girl is dancing in a brown shirt and blue jean skirt on a wooden floor\n",
"Getting embedding for Three puppies are playing outside.\n",
"Getting embedding for A woman in colorful native attire featuring a blue shirt with a colorful design displays her dark hair braided with red ribbons.\n",
"Getting embedding for A man hangs a poster in front of a shop.\n",
"Getting embedding for There are lots of cars on the street.\n",
"Getting embedding for A man parasails in the choppy water.\n",
"Getting embedding for Choir singing in mass.\n",
"Getting embedding for A biker is doing jumps by trees.\n",
"Getting embedding for A man is showing a woman something\n",
"Getting embedding for J.P. Morgan Chase held a track event.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A couple playing with a little boy on the beach.\n",
"Getting embedding for A woman talks to others indoors.\n",
"Getting embedding for A group of people sitting at a table outside talking.\n",
"Getting embedding for A child with a brightly colored shirt plays outside.\n",
"Getting embedding for The three young childeren were hold an apple with a bite on it\n",
"Getting embedding for A woman is outside.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A man is wearing a shirt.\n",
"Getting embedding for People are riding their bicycles.\n",
"Getting embedding for A woman is inside.\n",
"Getting embedding for Bikers stop and wait for traffic at the intersection.\n",
"Getting embedding for A woman is wearing a red jacket\n",
"Getting embedding for A man, wearing revolutionary period clothes, is ringing a bell.\n",
"Getting embedding for A man is wearing many silver necklaces.\n",
"Getting embedding for The people are holding onto the rail.\n",
"Getting embedding for A child has milk on their face.\n",
"Getting embedding for A man walking proudly down the street.\n",
"Getting embedding for An english farmer with a horse pulled wagon.\n",
"Getting embedding for A child with a yellow cup and milk all over his face.\n",
"Getting embedding for Small boy in pool holds toy.\n",
"Getting embedding for A man is sitting with his head facing down, while other people are looking in his direction.\n",
"Getting embedding for A child swimming.\n",
"Getting embedding for A woman is wearing a red jacket\n",
"Getting embedding for A man is hanging a picture of a child.\n",
"Getting embedding for Cheerleaders cheering.\n",
"Getting embedding for People wait for a race to begin.\n",
"Getting embedding for A woman wearing a blue shirt and green hat looks at the camera\n",
"Getting embedding for A man, wearing revolutionary period clothes, is ringing a bell.\n",
"Getting embedding for A young man in his mid twenties is kicking his left foot about two feet off the leaf covered ground, with paved asphalt and green plants and trees in the background.\n",
"Getting embedding for A man sitting in a barber shop.\n",
"Getting embedding for A woman wearing a blue shirt and green hat looks at the camera\n",
"Getting embedding for A man is painting a portrait of an outside scene that includes a street sign with a bicycle chained to it.\n",
"Getting embedding for A man parasails in the choppy water.\n",
"Getting embedding for The man is being stared at.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for There are a group of people are standing outside.\n",
"Getting embedding for A young man doing a trick on a skateboard down the stairs while being photographed.\n",
"Getting embedding for A person eating.\n",
"Getting embedding for People are looking in a car.\n",
"Getting embedding for A couple holding hands walks down a street.\n",
"Getting embedding for A dog swims towards the camera.\n",
"Getting embedding for Two men cook together with a metal bowl, near a hanging plant.\n",
"Getting embedding for The dogs run and play with a red ball.\n",
"Getting embedding for The women work in the health field.\n",
"Getting embedding for A woman sitting in a laundromat looking at the camera.\n",
"Getting embedding for A woman is lying down.\n",
"Getting embedding for A dog swims in a body of water.\n",
"Getting embedding for A man, wearing revolutionary period clothes, is ringing a bell.\n",
"Getting embedding for People are near snow.\n",
"Getting embedding for The people are all jumping into a body of water.\n",
"Getting embedding for A young woman tries to stick her foot in a fountain.\n",
"Getting embedding for A man is showing a woman something\n",
"Getting embedding for A child is pulling a toy wagon.\n",
"Getting embedding for A white bike is leaning against a post.\n",
"Getting embedding for A woman standing behind a grill outside with a blue basket of food in her hands.\n",
"Getting embedding for A dog drops a red disc on a beach.\n",
"Getting embedding for They are walking with a sign.\n",
"Getting embedding for A vendor trying to attract costumers.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A boy is in a boat.\n",
"Getting embedding for A woman is sitting at an outdoor dining table.\n",
"Getting embedding for There are soccer players on the field.\n",
"Getting embedding for A little boy swimming underwater with a toy in his hand.\n",
"Getting embedding for A dog is playing in the grass.\n",
"Getting embedding for A large group, wearing pink shirts, waves to onlookers.\n",
"Getting embedding for Two people are having a conversation.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for A man in a white shirt hangs a painting in a run down store while other men watch.\n",
"Getting embedding for The women and the man are on a bench.\n",
"Getting embedding for A person on skis on a rail at night.\n",
"Getting embedding for A large group, wearing pink shirts, waves to onlookers.\n",
"Getting embedding for A Little League team tries to catch a runner sliding into a base in an afternoon game.\n",
"Getting embedding for A brown dog running with two white and brown dogs on the seashore with crashing waves behind them.\n",
"Getting embedding for Emergency personnel looking into the back of a car.\n",
"Getting embedding for A young lady is looking at a picture.\n",
"Getting embedding for The toddler is getting a checkup.\n",
"Getting embedding for A team is trying to tag a runner out.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for An average looking man is playing the guitar.\n",
"Getting embedding for A soccer match between a team with white jerseys, and a team with yellow jerseys.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A man walking along side a clean up crew.\n",
"Getting embedding for A biker is doing jumps by trees.\n",
"Getting embedding for A street vendor in Asia tries to bring in more customers.\n",
"Getting embedding for A blond woman with two children is checking out at a Walmart register.\n",
"Getting embedding for There is a family taking a walk outside.\n",
"Getting embedding for man sitting down playing a game of chess alone\n",
"Getting embedding for A man is using his computer while seated at a desk.\n",
"Getting embedding for A view of buildings and people walking across the streets in Times Square, New York City.\n",
"Getting embedding for A lady is kneeling wearing a blue shirt.\n",
"Getting embedding for A couple carrying a child are walking along water.\n",
"Getting embedding for A man in ruffles pushes a stroller through a park.\n",
"Getting embedding for A couple pose in front of a fountain.\n",
"Getting embedding for A human wearing a dress.\n",
"Getting embedding for A little boy drinks milk and gets milk all over his face and table.\n",
"Getting embedding for A young woman tries to stick her foot in a fountain.\n",
"Getting embedding for A woman is sitting at an outdoor dining table.\n",
"Getting embedding for A person is painting.\n",
"Getting embedding for A soccer game played by a blue team and a red team on a perfectly manicured field at dusk.\n",
"Getting embedding for An elderly couple dances next to a table where a boy is sitting with his head down.\n",
"Getting embedding for A person in a red dress is running behind a black animal.\n",
"Getting embedding for A little boy drinks milk and gets milk all over his face and table.\n",
"Getting embedding for A woman with dark hair is wearing a green sweater.\n",
"Getting embedding for A model is doing a shoot.\n",
"Getting embedding for A young man has his head on the table.\n",
"Getting embedding for The young man is waiting with others on the sidewalk.\n",
"Getting embedding for A meeting of young people sitting at a conference table.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for A sign reads \"Welcome to Golden\"\n",
"Getting embedding for Two girls lay next to wooden blocks.\n",
"Getting embedding for A meeting of young people sitting at a conference table.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A couple holding hands walks down a street.\n",
"Getting embedding for A man watches another man.\n",
"Getting embedding for A group sits outside while talking.\n",
"Getting embedding for A human wearing a dress.\n",
"Getting embedding for A man wearing weird clothes is walking through a park.\n",
"Getting embedding for A driver is racing his Ford vehicle on a gravel track.\n",
"Getting embedding for The woman has one foot in the air.\n",
"Getting embedding for A man and woman walk on a street.\n",
"Getting embedding for A person eating.\n",
"Getting embedding for A girl wearing a dress is blowing bubbles at a dock.\n",
"Getting embedding for A smiling man cooks something delicious.\n",
"Getting embedding for A human standing.\n",
"Getting embedding for A brown dog running with two white and brown dogs on the seashore with crashing waves behind them.\n",
"Getting embedding for People are looking in a car.\n",
"Getting embedding for Man riding bike\n",
"Getting embedding for A worker is doing something to a boat.\n",
"Getting embedding for The dogs were outdoors running along the shore\n",
"Getting embedding for A team is trying to tag a runner out.\n",
"Getting embedding for A black-and-white dog carries a stick in his mouth as he swims in the clear water.\n",
"Getting embedding for The man is holding a balloon.\n",
"Getting embedding for A female violinist surrounded by other violinists.\n",
"Getting embedding for A man in a striped polo shirt is pointing and smiling.\n",
"Getting embedding for A group watches a practice.\n",
"Getting embedding for A person is painting.\n",
"Getting embedding for A young man is performing a jump on a skateboard while another young man photographs his stunt.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A woman is near a fountain.\n",
"Getting embedding for A person in a red dress is running behind a black animal.\n",
"Getting embedding for A man hangs a poster in front of a shop.\n",
"Getting embedding for One biker is running with their bike while another is riding around them.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for An elderly couple dances next to a table where a boy is sitting with his head down.\n",
"Getting embedding for Two men are standing outside and snow is on the ground.\n",
"Getting embedding for A woman with dark hair is wearing a green sweater.\n",
"Getting embedding for A man walking along side a clean up crew.\n",
"Getting embedding for A young lady is looking at her camera.\n",
"Getting embedding for A man is painting a landscape of an outdoors area.\n",
"Getting embedding for A man wearing blue jeans and red bowling shoes stands in a bowling alley lane with a green ball in his hand.\n",
"Getting embedding for A small group of church-goers watch a choir practice.\n",
"Getting embedding for A woman is dressed stylishly in native garb.\n",
"Getting embedding for A boat worker securing line.\n",
"Getting embedding for A street vendor in Asia tries to bring in more customers.\n",
"Getting embedding for A family by a van.\n",
"Getting embedding for A man and a woman are walking on a street at the top of a hill.\n",
"Getting embedding for A dog swims in a body of water.\n",
"Getting embedding for 1 little boy wearing a pirate costume following closely behind a little girl wearing a blue dress carrying a orange pumpkin bucket and walking down the sidewalk.\n",
"Getting embedding for Firemen walking outside\n",
"Getting embedding for A young girl dancing in her socks on a wooden floor strewn with pink balloons.\n",
"Getting embedding for A gentleman in a purple scarf and hat is looking at money while holding an accordion.\n",
"Getting embedding for A person in a red dress is running behind a black animal.\n",
"Getting embedding for He is wearing a multicolored shirt\n",
"Getting embedding for A woman is near a fountain.\n",
"Getting embedding for A man and a woman are holding hands as they walk along a city sidewalk.\n",
"Getting embedding for Child with pink strings on head dancing surrounded by confetti, balloons.\n",
"Getting embedding for A couple, who appear to be Indian or Pakistani, walk on a path beside a body of water, the mother carrying a child in a diaper, the father wrapped in a blanket with the logo of the humanitarian organization \"Save the Children.\"\n",
"Getting embedding for A blond woman with two children is checking out at a Walmart register.\n",
"Getting embedding for A family between a van and fence\n",
"Getting embedding for A blond woman with her hair up is taking off a white sweatshirt.\n",
"Getting embedding for A soccer game is happening.\n",
"Getting embedding for A dog is outside playing in the water.\n",
"Getting embedding for A child with a brightly colored shirt plays outside.\n",
"Getting embedding for People are playing soccer.\n",
"Getting embedding for A couple carrying a child are walking along water.\n",
"Getting embedding for A man hangs a poster in front of a shop.\n",
"Getting embedding for A fuzzy white lap dog runs along a rocky beach.\n",
"Getting embedding for Boy in costume followed by a girl in costume.\n",
"Getting embedding for A woman with children.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for A person is cooking.\n",
"Getting embedding for A man dances with a fire baton at night.\n",
"Getting embedding for A person holding a green bowling ball stands by the ball return machine in a bowling alley.\n",
"Getting embedding for A group of people are playing a game of soccer.\n",
"Getting embedding for The skier is wearing a yellow jumpsuit and sliding across a yellow rail.\n",
"Getting embedding for A man is wearing many silver necklaces.\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for Man riding bike\n",
"Getting embedding for A man is outside, near the street.\n",
"Getting embedding for 2 women are carrying little girls.\n",
"Getting embedding for People are skydiving.\n",
"Getting embedding for The Arsenal football club warms-up on the soccer field as a few fans watch.\n",
"Getting embedding for There are lots of cars on the street.\n",
"Getting embedding for A group of tourist waiting for a train at a train station.\n",
"Getting embedding for The school children head home.\n",
"Getting embedding for A group of people sitting around a picnic table.\n",
"Getting embedding for A woman is wearing a red jacket\n",
"Getting embedding for The man knows how to play guitar.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for A boy in a red and blue shirt painting a log.\n",
"Getting embedding for A man being airlifted to safety after being in danger.\n",
"Getting embedding for An old couple dance in by a juke box while a dude wearing shorts sleeps near a table\n",
"Getting embedding for A blond child is pulling a wagon with a little blond boy in it.\n",
"Getting embedding for A skier slides along a metal rail.\n",
"Getting embedding for A group of young men are splashing a lot of water.\n",
"Getting embedding for A groom and a bride are standing on the grass with his hand on her waist.\n",
"Getting embedding for A man is standing on top of a cart.\n",
"Getting embedding for A boy in a tri-colored shirt has his arms out to the side.\n",
"Getting embedding for A town has witnessed the arrival of three bikers.\n",
"Getting embedding for The woman is wearing green.\n",
"Getting embedding for A couple of people are loading brush onto a trailer that is attached to a truck.\n",
"Getting embedding for A woman is filling a suitcase.\n",
"Getting embedding for A woman in a blue shirt and green hat looks up at the camera.\n",
"Getting embedding for A man dances with a fire baton at night.\n",
"Getting embedding for A skater is in the pool.\n",
"Getting embedding for A black-and-white dog carries a stick in his mouth as he swims in the clear water.\n",
"Getting embedding for A young girl dancing in her socks on a wooden floor strewn with pink balloons.\n",
"Getting embedding for A person is near a watermelon.\n",
"Getting embedding for A woman on the side of a street is making food on her cart.\n",
"Getting embedding for A competition is happening.\n",
"Getting embedding for Three guys and a girl are all jumping in a pool together.\n",
"Getting embedding for A bike it outside\n",
"Getting embedding for A man is hanging up a picture of a child.\n",
"Getting embedding for A man walking proudly down the street.\n",
"Getting embedding for A group of people sitting around a picnic table.\n",
"Getting embedding for A row of legs and black boots with a boy sitting at the end of the row.\n",
"Getting embedding for A doctor checks the stomach of a toddler.\n",
"Getting embedding for Stacks of neatly folded clothing cover most of this floor while a woman with a beige shirt and jeans busily fills a suitcase.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for The skier is wearing a yellow jumpsuit and sliding across a yellow rail.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for A couple, wearing black, burgundy, and white, dance.\n",
"Getting embedding for A dog is swimming.\n",
"Getting embedding for A couple, wearing black, burgundy, and white, dance.\n",
"Getting embedding for A child stoops to pick up a watermelon from a large pile of them.\n",
"Getting embedding for A dog carries a stick in his mouth.\n",
"Getting embedding for Busy Japanese intersection like maybe Tokyo.\n",
"Getting embedding for Three men standing on grass by the water looking at something on a table.\n",
"Getting embedding for A black dog swimming in water near rocks.\n",
"Getting embedding for A baseball player is swinging to hit the ball.\n",
"Getting embedding for A man and two women in black jackets holding umbrellas sit on a long wooden bench.\n",
"Getting embedding for The people are near the table.\n",
"Getting embedding for Children going home from school.\n",
"Getting embedding for A man and woman are walking down the street holding hands.\n",
"Getting embedding for A dog is outside playing in the water.\n",
"Getting embedding for Several people are dancing together in sync.\n",
"Getting embedding for A couple, wearing black, burgundy, and white, dance.\n",
"Getting embedding for A group of tourist waiting for a train at a train station.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A child is pulling a toy wagon.\n",
"Getting embedding for The furry brown dog is swimming in the ocean.\n",
"Getting embedding for The man knows how to play guitar.\n",
"Getting embedding for The furry brown dog is swimming in the ocean.\n",
"Getting embedding for People going for a long walk to the mountains.\n",
"Getting embedding for A boy in a tri-colored shirt has his arms out to the side.\n",
"Getting embedding for A woman is running a marathon in a park.\n",
"Getting embedding for A family between a van and fence\n",
"Getting embedding for A doctor checks the stomach of a toddler.\n",
"Getting embedding for 1 little boy wearing a pirate costume following closely behind a little girl wearing a blue dress carrying a orange pumpkin bucket and walking down the sidewalk.\n",
"Getting embedding for A man carrying a load of fresh direct boxes on car with wheels in the city streets, as a woman walks towards him.\n",
"Getting embedding for A man walking proudly down the street.\n",
"Getting embedding for A couple playing with a little boy on the beach.\n",
"Getting embedding for An elderly couple dance in front of a juke box while a guy in shorts sleeps at a nearby table\n",
"Getting embedding for A person is sitting down.\n",
"Getting embedding for Some humans in a truck\n",
"Getting embedding for A young woman tries to stick her foot in a fountain.\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for A group of people are playing a game of soccer.\n",
"Getting embedding for A man being photographed while he does a trick on his skateboard down the stairs.\n",
"Getting embedding for A dog is nearby a person\n",
"Getting embedding for Children playing a game in a field.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt plays outside.\n",
"Getting embedding for Workers are taking a break during midday.\n",
"Getting embedding for A woman in colorful garb with her back to the camera and cloth on her hear.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for A man in a green shirt holds out a clipboard for a woman in pink's attention.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for Two people with bicycles, one in front running with a bike and one in back riding.\n",
"Getting embedding for Two barefoot men are playing on a green lawn outside a building with other people in the background.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for A foreign family is walking along a dirt path next to the water.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for a man wearing a multicolored striped shirt playing the guitar on the street\n",
"Getting embedding for Man wearing black t-shirt sitting at a computer desk.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for A white and brown dog is leaping through the air.\n",
"Getting embedding for A group of people are sitting around a table under a blue sunshade.\n",
"Getting embedding for Woman balancing on edge of fountain while sticking her toe in the water.\n",
"Getting embedding for Overlooking a street with a sign above shops that states Welcome To Golden.\n",
"Getting embedding for A blond man is drinking from a public fountain.\n",
"Getting embedding for Two women who just had lunch hugging and saying goodbye.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for The boy in the blue and yellow top is standing with arms outstretched.\n",
"Getting embedding for A person in an orange shirt is laying on the ground while others are standing around her, smiling.\n",
"Getting embedding for A man windsurfs in a wetsuit.\n",
"Getting embedding for A young boy with a blue coat makes a funny face as he walks towards the grass.\n",
"Getting embedding for A woman sitting at a table, taking a picture.\n",
"Getting embedding for A motorcycle racer is in action at the track.\n",
"Getting embedding for The red team knocked the ball toward the goal and the black team tried to block it.\n",
"Getting embedding for A guy performing a bicycle jump trick for an audience.\n",
"Getting embedding for Three firefighters, the nearest firefighter is holding a helmet in his left hand.\n",
"Getting embedding for A small girl stands among many large watermelons.\n",
"Getting embedding for A woman checks her purse while at a outside cafe.\n",
"Getting embedding for Six soccer players on field with player in red uniform in the air and ball airborne.\n",
"Getting embedding for Cheerleaders are doing a cheer at a football field.\n",
"Getting embedding for People sitting down to eat.\n",
"Getting embedding for People are on a stage performing.\n",
"Getting embedding for A woman holding a boombox.\n",
"Getting embedding for Small laughing child with blond-hair sitting at a table holding a green sippy cup.\n",
"Getting embedding for One soccer team is playing against another.\n",
"Getting embedding for An elderly woman places carrots into a casserole.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for Two dogs playfully bite a third dog, which has its tongue sticking out.\n",
"Getting embedding for A large golden dog sniffing the butt of a white dog\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for A Land Rover makes its way through a deep pond.\n",
"Getting embedding for People are stretching on yoga mats.\n",
"Getting embedding for A small boy has gotten into the cabinet and gotten flour and crisco all over himself.\n",
"Getting embedding for a man with a white covering is walking up a flight of stairs.\n",
"Getting embedding for A couple of people working around a pile of rocks.\n",
"Getting embedding for Two children are running down a sidewalk dressed in costumes.\n",
"Getting embedding for A man and two women sitting on a bench.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for Three men, two wearing yellow suits, are looking in the back of a car.\n",
"Getting embedding for Children's soccer game being played while the sun sets in the background.\n",
"Getting embedding for An old man wearing khaki pants and a brown shirt standing on the sidewalk in front of a building.\n",
"Getting embedding for A white horse is pulling a cart while a man stands and watches.\n",
"Getting embedding for A group of people are doing yoga.\n",
"Getting embedding for A woman in a floral dress talks to children in front of a van.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for A man wearing a colorful and striped sweater plays music in the street.\n",
"Getting embedding for Two children play outside in a field.\n",
"Getting embedding for A man in a blue shirt sits outside alone with a chessboard laid out in front of him.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for People are fishing and walking next to the water.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for A group of adults is having a discussion at a table under a tent.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for A woman with a black jacket walks past an outdoor movie poster.\n",
"Getting embedding for People walking around in a big city.\n",
"Getting embedding for People in line for plates of rice.\n",
"Getting embedding for Three men are grouped around the back of a car with its tailgate out, two of the men clothed in yellow uniforms and one in blue.\n",
"Getting embedding for The cowboy waves to the rodeo crowd.\n",
"Getting embedding for two men serving preparing food.\n",
"Getting embedding for A dog zips along the beach.\n",
"Getting embedding for a woman on a yellow shirt is on the floor.\n",
"Getting embedding for many people relax in the yard.\n",
"Getting embedding for A group of people gathers on the grass in a backyard with tents, tables, and chairs set up.\n",
"Getting embedding for A man dressed in snow-gear takes a leap into a snow-covered ravine.\n",
"Getting embedding for A cowboy is riding a bucking bull in a rodeo arena.\n",
"Getting embedding for A man with wild hair rocks a show playing a guitar center stage.\n",
"Getting embedding for Three construction workers posing with construction materials.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for A woman wearing all white and eating, walks next to a man holding a briefcase.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for Five people on stage performing and acting while girl lay's on belly.\n",
"Getting embedding for Two people are next to a fountain with a red bottom and arches of water.\n",
"Getting embedding for a woman with a straw hat working on a strange machine with coconuts at her side.\n",
"Getting embedding for Asian school children sitting on each others shoulders.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for Students practicing yoga in a class setting.\n",
"Getting embedding for Two soccer teams are competing on a soccer field.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for An Asian woman in a blue top and green headscarf smiling widely as another woman rows a boat in the background.\n",
"Getting embedding for Two guys playing football on a campus green.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for Girl is blowing to a butterfly.\n",
"Getting embedding for a child is pushing another kid in a wheeler dressed in a red top and wearing a cap.\n",
"Getting embedding for A classroom of students discussing lecture.\n",
"Getting embedding for A group of people point forwards while performing some kind of act.\n",
"Getting embedding for A white bike is tied to a street sign.\n",
"Getting embedding for A guy stands with a green bowling ball in his hand, and looks down the bowling lane.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A crowded city during daytime.\n",
"Getting embedding for A fireman protects an area by setting up a boundary while others watch.\n",
"Getting embedding for A lady with sunglasses on her head and a green sweatshirt is looking off-camera.\n",
"Getting embedding for A man is running behind a dogsled being pulled by four dogs.\n",
"Getting embedding for A child in formal clothing is walking along the edge of a stony area that is littered in places.\n",
"Getting embedding for Some children are playing jump rope.\n",
"Getting embedding for An older man dressed in blue historical clothing is ringing a bell in his right hand.\n",
"Getting embedding for A young girl sitting at a table with a bowl on her head\n",
"Getting embedding for Brown dog treads through water.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for The silhouette of three people in front of a wall.\n",
"Getting embedding for Three people stand proudly by a truck stocked with building supplies in the street.\n",
"Getting embedding for A farmer fertilizing his garden with manure with a horse and wagon.\n",
"Getting embedding for The school is having a special event in order to show the american culture on how other cultures are dealt with in parties.\n",
"Getting embedding for Young people playing with a long jump rope in the street.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for A saddle bronc rider gets lifted out of the saddle, but keeps his grip during his ride.\n",
"Getting embedding for A boy is jumping on skateboard in the middle of a red bridge.\n",
"Getting embedding for A man doing tricks in the snow.\n",
"Getting embedding for A family with a baby, the father is wearing a save the children sign.\n",
"Getting embedding for A woman preparing to glaze a bowl.\n",
"Getting embedding for The man wearing lots of medals is watching the girl in the yellow bikini top.\n",
"Getting embedding for Three working men smile in front of a truck while holding construction equipment.\n",
"Getting embedding for A boy is drinking out of a water fountain shaped like a woman.\n",
"Getting embedding for A group of people stand on a grassy field.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A lady wearing a blue print shirt and green head cover smiles for the camera.\n",
"Getting embedding for A seated woman with short hair and a camera throws a Frisbee to a brown dog.\n",
"Getting embedding for A man in a blue shirt is looking up.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for Four guys in wheelchairs on a basketball court two are trying to grab a basketball in midair.\n",
"Getting embedding for A man wearing a tan coat signs papers for another man wearing a blue coat.\n",
"Getting embedding for A black and white dog with a stick in its mouth is swimming.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for A foreign family is walking along a dirt path next to the water.\n",
"Getting embedding for A pirate is chasing a princess down the sidewalk.\n",
"Getting embedding for A doctor in blue scrubs is performing an operation assisted by two men and a woman.\n",
"Getting embedding for A man in shorts and a white garment stands at the base of stairs framed by black railing.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for A man wearing a rice hat is shucking corn using a corn shucker and is surrounded by trees.\n",
"Getting embedding for A person is a red hat and winter jacket is looking into the distance.\n",
"Getting embedding for A man is sleeping on the grass.\n",
"Getting embedding for Street performer in colorful shirt performing with small guitar.\n",
"Getting embedding for Two men trying to build something together, while having fun.\n",
"Getting embedding for A woman and a girl are playing in a field of leaves\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for A lady is on the floor packing a suitcase.\n",
"Getting embedding for A woman in a blue shirt is sitting at a table and looking at her cellphone.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for Two people pose for the camera.\n",
"Getting embedding for The parents of the younger male are posing for a picture in front of a water fountain.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for A couple play in the tide with their young son.\n",
"Getting embedding for A foreign family is walking along a dirt path next to the water.\n",
"Getting embedding for A woman in capri jeans crouches on the edge of a fountain with her left foot kicked out to touch the falling water.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for A couple walk through a white brick town.\n",
"Getting embedding for A car sinking in water.\n",
"Getting embedding for Man on the sidewalk sitting on a motorcycle.\n",
"Getting embedding for Gray dog running down pavement toward laundry line in courtyard.\n",
"Getting embedding for Two people enjoying a water fountain display.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for A young woman frolicking on the lawn in front of the us capitol building.\n",
"Getting embedding for An older man stands on the sidewalk painting the view.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for A young toddler wearing pink sandals is walking on hopscotch numbers.\n",
"Getting embedding for Two children in hats play in an open, rocky field.\n",
"Getting embedding for Two large dogs greet other while their owners watch.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A woman wearing a green and pink dress is dancing with someone wearing a blue top with white pants.\n",
"Getting embedding for Two older men in coats are standing outside.\n",
"Getting embedding for A man wearing a blue shirt screaming or yelling with his arms raised up in the air.\n",
"Getting embedding for A man and a woman cross the street in front of a pizza and gyro restaurant.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for Several men sit outside on brick ledges built around tall trees.\n",
"Getting embedding for A soccer player jumping in air during a game.\n",
"Getting embedding for A man with blond-hair, and a brown shirt drinking out of a public water fountain.\n",
"Getting embedding for A mountain biker jumping a slope outdoors in a forest area.\n",
"Getting embedding for Man in gold pants looking at the camera.\n",
"Getting embedding for A man and a woman having an intimate conversation in front a statue.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for Exhausted looking firemen are walking.\n",
"Getting embedding for A man dressed in warm clothing sleds behind four dogs in the snow.\n",
"Getting embedding for The blond girl is dancing inside a house.\n",
"Getting embedding for A woman wearing an apron inspects a large pot on a table filled with cups, bowls, pots and baskets of assorted size.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for Workers are eating a meal while one man sits on a pile of plywood.\n",
"Getting embedding for Black dog jumping into the air to catch a toy in the snow.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for New sport is being played to show appreciation to the kids who can not walk.\n",
"Getting embedding for A few people in a restaurant setting, one of them is drinking orange juice.\n",
"Getting embedding for A lady in a black and white striped shirt and holding a bouquet of flowers, looks seriously at two gentlemen talking on the steps.\n",
"Getting embedding for A foreign family is walking along a dirt path next to the water.\n",
"Getting embedding for A black dog in snow is jumping off the ground to catch a stick.\n",
"Getting embedding for A little girl picking up a watermelon from a pile.\n",
"Getting embedding for A man wearing black with a gray hat, holding a pitchfork, directs a horse-drawn cart.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for People relax around a large community fountain in a park.\n",
"Getting embedding for One man sits inside and plays the banjo, there are trees behind him outside.\n",
"Getting embedding for A child using a woodworking tool\n",
"Getting embedding for a lone person jumping through the air from one snowy mountain to another.\n",
"Getting embedding for A little boy in a pirate costume is running behind a little girl in a princess costume carrying an orange pumpkin along the sidewalk.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for A man dressed in blue shirt and shorts sits at a table while playing black in chess.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for A windsurfer is balancing on choppy water.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for A man holds a clipboard and a pen as a woman looks at them.\n",
"Getting embedding for a man wearing blue plays soccer.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A woman playing the violin with sunglasses on her head.\n",
"Getting embedding for Biker riding dirt bike on dirt track\n",
"Getting embedding for A little girl follows two guys with umbrellas down a path.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for A man is sitting on a motorcycle on the sidewalk.\n",
"Getting embedding for A woman talking to four little children outside.\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A kite surfer begins to fall in the ocean.\n",
"Getting embedding for woman and child on trolley car labeled Powell and market and bay and taylor\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for A man in a kitchen is frying breaded food in a cast iron pan.\n",
"Getting embedding for A man wearing a gray cap is looking down.\n",
"Getting embedding for A woman in blue jeans and a dark jacket walks in front of a building.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for Two people dancing, wearing dance costumes.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for Two blond women are hugging one another.\n",
"Getting embedding for A view of a marketplace full of people in an asian country.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for An elderly man is drinking orange juice at a cafe.\n",
"Getting embedding for A woman in a black and orange jacket throws a stick for a brown and black dog to fetch.\n",
"Getting embedding for A young man in blue sunglasses walking in front of a red brick building.\n",
"Getting embedding for Two women, holding food carryout containers, hug.\n",
"Getting embedding for Lady wearing a yellow top is sitting on a chair\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A group of people are playing soccer and two players from opposing teams are battling for the ball.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for Two children, in colorful outfits, playing in a field with a big rock in the middle.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for A woman in a red shirt is speaking at a table in a room where three other people are listening to her.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for A blue excavator digging a large hold in cement.\n",
"Getting embedding for Two women are talking while children are sitting on their laps.\n",
"Getting embedding for A middle-aged oriental woman in a green headscarf and blue shirt is flashing a giant smile.\n",
"Getting embedding for An adult couple enjoys time in a hot tub.\n",
"Getting embedding for A small girl with a necklace is swimming.\n",
"Getting embedding for A boy looks down and spreads his arms wide\n",
"Getting embedding for The blond woman is searching for medical supplies in a suitcase.\n",
"Getting embedding for A man wearing a striped top and jeans does a skateboard trick on some steps while a man who is hunched over photographs him.\n",
"Getting embedding for Soccer players warm up by kicking the soccer ball around while the crowd waits.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for A young man in a blue blazer and shorts sits alone in front of table with a chess game set up.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for An old man wearing khaki pants and a brown shirt standing on the sidewalk in front of a building.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for A young man wearing a backpack and dark glasses approaches the brick building where there is a bit of graffiti on the wall.\n",
"Getting embedding for A couple strolls arm and arm and hand in hand down a city sidewalk.\n",
"Getting embedding for A white dog running in the backyard.\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for A man wearing a gray sweater walking through a pile of leaves.\n",
"Getting embedding for A man wearing a multi-color coat is playing the guitar on the street.\n",
"Getting embedding for Men are playing soccer, the one in front is about to kick the ball.\n",
"Getting embedding for A man is leading a Clydesdale up a hay road, within a Old Country.\n",
"Getting embedding for A camera crew is filming two women in formal dresses sitting on a blanket in the middle of a park.\n",
"Getting embedding for A man in a blue jacket screaming.\n",
"Getting embedding for In a bowling alley, a man holding a green bowling ball looks ahead at the pins that he must knock down.\n",
"Getting embedding for Young blond woman putting her foot into a water fountain\n",
"Getting embedding for A man wakeboards on choppy water.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for Some firefighters check a vehicle.\n",
"Getting embedding for Two dogs biting another dog in a field.\n",
"Getting embedding for A small girl dressed in a yellow dress with flowers on it bends over near a large pile of watermelons.\n",
"Getting embedding for A group of people sitting at some sort of gathering.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for Two female medical professionals, one african american& one white, looking over paperwork in a hospital.\n",
"Getting embedding for A picture of a city with a sign welcoming travelers on a busy street.\n",
"Getting embedding for A man with khaki shorts on is holding a little girls hand while she walks in the water of a creek.\n",
"Getting embedding for A man in blue lies on a mostly-barren patch of grass while small groups of people congregate in the distance.\n",
"Getting embedding for A little boy with a blue jacket is making a sour face at the camera.\n",
"Getting embedding for Children smiling and waving at camera\n",
"Getting embedding for A girl wearing a blue shirt, shorts, and sneakers is seated on a stool at a round table, looking at her phone.\n",
"Getting embedding for Outside by the trees, a woman wearing jeans and red jacket throws something for a German shepherd to chase.\n",
"Getting embedding for A person is looking at water jets.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for A bearded man in a black t-shirt sits in front of a desk holding a computer.\n",
"Getting embedding for A baby is playing with a strand of beads.\n",
"Getting embedding for Two little kids showing their American pride in their star spangled wagon.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for White small child wearing a brown and gray striped hoodie plays at park.\n",
"Getting embedding for A smiling lady in a green jacket at a public gathering.\n",
"Getting embedding for A dog is fetching a stick out of very clear water.\n",
"Getting embedding for A man with facial hair and a red and gray shirt tugging on a piece of rope.\n",
"Getting embedding for An elderly couple, both wearing white shirts, dancing and a young male sitting at a table.\n",
"Getting embedding for An older couple posing in front of a fountain for a picture\n",
"Getting embedding for A man, woman, and child enjoying themselves on a beach.\n",
"Getting embedding for A man stopping on the sidewalk with his bike to have a smoke.\n",
"Getting embedding for A man is standing up holding a green bowling ball with his right hand.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A man in a Tour De Force shirt is working on a bicycle.\n",
"Getting embedding for A group of children play jump rope in the streets while others watch in the background.\n",
"Getting embedding for A woman and a few children in an alleyway in between a vehicle and a fence.\n",
"Getting embedding for A man resting on a street.\n",
"Getting embedding for A mountain biker jumping a slope outdoors in a forest area.\n",
"Getting embedding for A man in the distance is walking past a brick wall painted with words and graffiti.\n",
"Getting embedding for A skier in electric green on the edge of a ramp made of metal bars.\n",
"Getting embedding for A yellow uniformed skier is performing a trick across a railed object.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for A man in a gray vehicle feeding sheep.\n",
"Getting embedding for A group of men and women are having a discussion in a restaurant.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for Man smokes while sitting on a parked scooter.\n",
"Getting embedding for Two ladies are reading through binders.\n",
"Getting embedding for A large golden dog sniffing the butt of a white dog\n",
"Getting embedding for A man is windsurfing.\n",
"Getting embedding for Firefighters are checking a car.\n",
"Getting embedding for A city filled with people in the middle of the daytime.\n",
"Getting embedding for A child.\n",
"Getting embedding for man playing soccer\n",
"Getting embedding for A skier in electric green on the edge of a ramp made of metal bars.\n",
"Getting embedding for A school is hosting an event.\n",
"Getting embedding for A photographer takes a picture of the boy's parents by the fountain.\n",
"Getting embedding for A man is windsurfing.\n",
"Getting embedding for A man is pulling on a rope.\n",
"Getting embedding for A person is looking at water jets.\n",
"Getting embedding for A little boy in a pirate costume is running behind a little girl in a princess costume carrying an orange pumpkin along the sidewalk.\n",
"Getting embedding for People are about to eat.\n",
"Getting embedding for A soccer game.\n",
"Getting embedding for A woman is interacting with a dog.\n",
"Getting embedding for A group of people are sitting around a table under a blue sunshade.\n",
"Getting embedding for A man in a blue shirt sits outside alone with a chessboard laid out in front of him.\n",
"Getting embedding for A woman is talking to children.\n",
"Getting embedding for A guy performing a bicycle jump trick for an audience.\n",
"Getting embedding for A female adult is near some kids.\n",
"Getting embedding for A boy looks down and spreads his arms wide\n",
"Getting embedding for A couple walk through a white brick town.\n",
"Getting embedding for A crowded city during daytime.\n",
"Getting embedding for A group of people gathering on the grass.\n",
"Getting embedding for People are standing on a grassy field\n",
"Getting embedding for A woman in a floral dress talks to children in front of a van.\n",
"Getting embedding for A man photographs a skateboarder doing tricks.\n",
"Getting embedding for A little boy with a blue jacket is making a sour face at the camera.\n",
"Getting embedding for A boy looks down and spreads his arms wide\n",
"Getting embedding for The man is putting up a poster.\n",
"Getting embedding for A family walks along a dirt path.\n",
"Getting embedding for A man is sitting on a motorcycle on the sidewalk.\n",
"Getting embedding for Someone is on top of a cart full of items, while someone else observes.\n",
"Getting embedding for The man rides an animal.\n",
"Getting embedding for A blonde woman looks for things in a suitcase.\n",
"Getting embedding for New sport is being played to show appreciation to the kids who can not walk.\n",
"Getting embedding for A lady with sunglasses on her head and a green sweatshirt is looking off-camera.\n",
"Getting embedding for A few people share a bench.\n",
"Getting embedding for A lady wearing a blue shirt.\n",
"Getting embedding for A group of adults is having a discussion at a table under a tent.\n",
"Getting embedding for The boy mugs for the camera.\n",
"Getting embedding for A guy stands on stage with his guitar.\n",
"Getting embedding for A man waiting with his computer.\n",
"Getting embedding for A few people share a bench.\n",
"Getting embedding for An animal is walking outside.\n",
"Getting embedding for A man in a blue shirt is looking up.\n",
"Getting embedding for A lady with a serious face is standing with two guys in front of steps outside.\n",
"Getting embedding for The old man is standing outside of a building.\n",
"Getting embedding for The blond woman is searching for medical supplies in a suitcase.\n",
"Getting embedding for The diners are at a restaurant.\n",
"Getting embedding for A woman sitting at a table, taking a picture.\n",
"Getting embedding for A dog is catching a stick.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for A fireman protects an area by setting up a boundary while others watch.\n",
"Getting embedding for Four guys in wheelchairs on a basketball court two are trying to grab a basketball in midair.\n",
"Getting embedding for A man with wild hair rocks a show playing a guitar center stage.\n",
"Getting embedding for People are playing a sport.\n",
"Getting embedding for The shop sign says \"Welcome to Golden\"\n",
"Getting embedding for A man and two women sitting on a bench.\n",
"Getting embedding for A man is sleeping on the grass.\n",
"Getting embedding for A cowboy is riding a bucking bull in a rodeo arena.\n",
"Getting embedding for A man is on a dirt bike.\n",
"Getting embedding for A city filled with people in the middle of the daytime.\n",
"Getting embedding for An elderly woman places carrots into a casserole.\n",
"Getting embedding for A man on a street in a bright t-shirt holds some sort of tablet towards a woman in a pink t-shirt and shades.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for A man with a red shirt is watching another man who is standing on top of a attached cart filled to the top.\n",
"Getting embedding for People with bikes.\n",
"Getting embedding for The family is outside.\n",
"Getting embedding for A man squatting in the foreground of a photograph while taking a photograph of his own of a man doing a skateboarding kick flip in midair above a short flight of stairs outdoors.\n",
"Getting embedding for Two dogs biting another dog in a field.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A boy is holding his arms out.\n",
"Getting embedding for A little boy with a blue jacket is making a sour face at the camera.\n",
"Getting embedding for An older man stands on the sidewalk painting the view.\n",
"Getting embedding for A classroom of students discussing lecture.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for A child using a woodworking tool\n",
"Getting embedding for A fireman is working hard to keep people safe.\n",
"Getting embedding for There are people at work.\n",
"Getting embedding for A seated woman with short hair and a camera throws a Frisbee to a brown dog.\n",
"Getting embedding for Six soccer players on field with player in red uniform in the air and ball airborne.\n",
"Getting embedding for A woman is interacting with a dog.\n",
"Getting embedding for A lady is on the floor packing a suitcase.\n",
"Getting embedding for A person looks up.\n",
"Getting embedding for A male sitting indoors.\n",
"Getting embedding for A man is wakeboarding.\n",
"Getting embedding for A blue excavator digging a large hold in cement.\n",
"Getting embedding for The boy does a skateboarding trick.\n",
"Getting embedding for Children playing a game.\n",
"Getting embedding for A man is making a loud noise.\n",
"Getting embedding for An older couple posing in front of a fountain for a picture\n",
"Getting embedding for Two kids are running.\n",
"Getting embedding for A man is walking with his horse up a country road.\n",
"Getting embedding for A groups of people acts on stage.\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for A man in a blue shirt sits outside alone with a chessboard laid out in front of him.\n",
"Getting embedding for A woman checks her purse while at a outside cafe.\n",
"Getting embedding for A few people share a bench.\n",
"Getting embedding for A cowboy rides a bull at a rodeo.\n",
"Getting embedding for A man dressed in warm clothing sleds behind four dogs in the snow.\n",
"Getting embedding for Gray dog running down pavement toward laundry line in courtyard.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A woman in a red shirt is speaking at a table in a room where three other people are listening to her.\n",
"Getting embedding for A white and brown dog is leaping through the air.\n",
"Getting embedding for A man wakeboards on choppy water.\n",
"Getting embedding for A group of adults is having a discussion at a table under a tent.\n",
"Getting embedding for A woman is interacting with a dog.\n",
"Getting embedding for The old man is standing outside of a building.\n",
"Getting embedding for A couple play in the tide with their young son.\n",
"Getting embedding for People are walking outdoors.\n",
"Getting embedding for Three working men smile in front of a truck while holding construction equipment.\n",
"Getting embedding for A decorated man sees a scantily clad female.\n",
"Getting embedding for Children are jumping rope.\n",
"Getting embedding for Two women hug each other.\n",
"Getting embedding for A woman in a floral dress talks to children in front of a van.\n",
"Getting embedding for A woman wearing an apron inspects a large pot on a table filled with cups, bowls, pots and baskets of assorted size.\n",
"Getting embedding for A couple of people working around a pile of rocks.\n",
"Getting embedding for A boy is drinking out of a water fountain shaped like a woman.\n",
"Getting embedding for A man wearing a colorful and striped sweater plays music in the street.\n",
"Getting embedding for A man wearing a tan coat signs papers for another man wearing a blue coat.\n",
"Getting embedding for There are bicyclists stopped at a road.\n",
"Getting embedding for Four guys are playing basketball.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A little girl picking up a watermelon from a pile.\n",
"Getting embedding for A man in a blue jacket screaming.\n",
"Getting embedding for A soccer game is in progress.\n",
"Getting embedding for A young man in a blue blazer and shorts sits alone in front of table with a chess game set up.\n",
"Getting embedding for A woman in blue jeans and a dark jacket walks in front of a building.\n",
"Getting embedding for A kite surfer begins to fall in the ocean.\n",
"Getting embedding for Firefighters are checking a car.\n",
"Getting embedding for A man in a blue shirt sits outside alone with a chessboard laid out in front of him.\n",
"Getting embedding for A woman in white.\n",
"Getting embedding for A couple strolls arm and arm and hand in hand down a city sidewalk.\n",
"Getting embedding for A man is holding a girls hand and walking through a creek.\n",
"Getting embedding for A classroom of students discussing lecture.\n",
"Getting embedding for Several men sit outside on brick ledges built around tall trees.\n",
"Getting embedding for A young toddler wearing pink sandals is walking on hopscotch numbers.\n",
"Getting embedding for There are children present\n",
"Getting embedding for A large golden dog sniffing the butt of a white dog\n",
"Getting embedding for A group of people gathers on the grass in a backyard with tents, tables, and chairs set up.\n",
"Getting embedding for The man is drinking water.\n",
"Getting embedding for A decorated man sees a scantily clad female.\n",
"Getting embedding for A man windsurfs in a wetsuit.\n",
"Getting embedding for A female adult is near some kids.\n",
"Getting embedding for The bikers are in the town.\n",
"Getting embedding for A baby walks on the ground.\n",
"Getting embedding for A man is working on a bike.\n",
"Getting embedding for A view of a marketplace full of people in an asian country.\n",
"Getting embedding for A man is bowling.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for Children smiling and waving at camera\n",
"Getting embedding for A group of people point forwards while doing something.\n",
"Getting embedding for A woman in blue jeans and a dark jacket walks in front of a building.\n",
"Getting embedding for A smiling lady in a green jacket at a public gathering.\n",
"Getting embedding for A car is flooding.\n",
"Getting embedding for A man wearing a gray sweater walking through a pile of leaves.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for A man is sitting on a motorcycle.\n",
"Getting embedding for A man wants a woman to look at his clipboard\n",
"Getting embedding for A young woman frolicking on the lawn in front of the us capitol building.\n",
"Getting embedding for Two dogs playfully bite a third dog, which has its tongue sticking out.\n",
"Getting embedding for A man doing a wheelie\n",
"Getting embedding for An animal is walking outside.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for Small laughing child with blond-hair sitting at a table holding a green sippy cup.\n",
"Getting embedding for A boy is holding his arms out.\n",
"Getting embedding for A couple are having a conversation\n",
"Getting embedding for Pirate on the sidewalk\n",
"Getting embedding for A man pulls on a rope.\n",
"Getting embedding for Black dog jumping into the air to catch a toy in the snow.\n",
"Getting embedding for A person is dipping her foot into water.\n",
"Getting embedding for A dog zips along the beach.\n",
"Getting embedding for A person on a bike is near a street.\n",
"Getting embedding for Four guys in wheelchairs on a basketball court two are trying to grab a basketball in midair.\n",
"Getting embedding for The young man has glasses on his face.\n",
"Getting embedding for A man is sitting on a motorcycle on the sidewalk.\n",
"Getting embedding for A soccer game.\n",
"Getting embedding for A small girl with a necklace is swimming.\n",
"Getting embedding for Four guys are playing basketball.\n",
"Getting embedding for A man is pulling on a rope.\n",
"Getting embedding for A boy in multi-colored shirt hold his arms out from his sides\n",
"Getting embedding for A woman walking outside.\n",
"Getting embedding for A group of people are doing yoga.\n",
"Getting embedding for A woman is talking to children.\n",
"Getting embedding for A person looks in her purse at a restaurant.\n",
"Getting embedding for People walking around in a big city.\n",
"Getting embedding for People are playing a sport in honor of crippled people.\n",
"Getting embedding for A pirate is chasing a princess down the sidewalk.\n",
"Getting embedding for A foreign family walks by a dirt trail along a body of water.\n",
"Getting embedding for A crowded city during daytime.\n",
"Getting embedding for A woman in colorful garb with her back to the camera and cloth on her hear.\n",
"Getting embedding for A little girl follows two guys with umbrellas down a path.\n",
"Getting embedding for Asian school children sitting on each others shoulders.\n",
"Getting embedding for A woman in capri jeans crouches on the edge of a fountain with her left foot kicked out to touch the falling water.\n",
"Getting embedding for The toddler has milk around the corners of his mouth.\n",
"Getting embedding for Three men are grouped around the back of a car.\n",
"Getting embedding for A small boy has gotten into the cabinet and gotten flour and crisco all over himself.\n",
"Getting embedding for More than one person on a bicycle is obeying traffic laws.\n",
"Getting embedding for Lady wearing a yellow top is sitting on a chair\n",
"Getting embedding for There are some guys in this picture\n",
"Getting embedding for Firefighters are checking a car.\n",
"Getting embedding for A boy is drinking out of a water fountain shaped like a woman.\n",
"Getting embedding for A man in a green shirt holds out a clipboard for a woman in pink's attention.\n",
"Getting embedding for Two men are barefoot on the lawn.\n",
"Getting embedding for A woman is looking at a man's possessions\n",
"Getting embedding for Two ladies are reading through binders.\n",
"Getting embedding for A couple play in the tide with their young son.\n",
"Getting embedding for A little girl picking up a watermelon from a pile.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for A man is taking the picture of a skateboarder who is performing a trick.\n",
"Getting embedding for A family with a baby, the father is wearing a save the children sign.\n",
"Getting embedding for There are scultupres nearby.\n",
"Getting embedding for A soccer player jumping up while a game is in progess.\n",
"Getting embedding for A man in the distance is walking past a brick wall painted with words and graffiti.\n",
"Getting embedding for A doctor in blue scrubs is performing an operation assisted by two men and a woman.\n",
"Getting embedding for A small girl dressed in a yellow dress with flowers on it bends over near a large pile of watermelons.\n",
"Getting embedding for A doctor in blue scrubs is performing an operation assisted by two men and a woman.\n",
"Getting embedding for A lady in a black and white striped shirt and holding a bouquet of flowers, looks seriously at two gentlemen talking on the steps.\n",
"Getting embedding for People are about to eat.\n",
"Getting embedding for A small girl with a necklace is in the water\n",
"Getting embedding for A woman playing the violin with sunglasses on her head.\n",
"Getting embedding for A group of men and women are having a discussion in a restaurant.\n",
"Getting embedding for A man wearing a multi-color coat is playing the guitar on the street.\n",
"Getting embedding for The white and brown dog is in the air.\n",
"Getting embedding for A lady wearing a blue print shirt and green head cover smiles for the camera.\n",
"Getting embedding for A group of people are doing yoga.\n",
"Getting embedding for A group of people stand on a grassy field.\n",
"Getting embedding for A couple play in the tide with their young son.\n",
"Getting embedding for A boy in a shirt plays outside.\n",
"Getting embedding for Man on the sidewalk sitting on a motorcycle.\n",
"Getting embedding for Somebody is engaging in winter sports.\n",
"Getting embedding for A windsurfer is balancing on choppy water.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for A man wearing a blue shirt screaming or yelling with his arms raised up in the air.\n",
"Getting embedding for A lady with a serious face is standing with two guys in front of steps outside.\n",
"Getting embedding for A foreign family walks by a dirt trail along a body of water.\n",
"Getting embedding for People are waiting to eat.\n",
"Getting embedding for A woman in capri jeans crouches on the edge of a fountain with her left foot kicked out to touch the falling water.\n",
"Getting embedding for A couple walk through a white brick town.\n",
"Getting embedding for Overlooking a street with a sign above shops that states Welcome To Golden.\n",
"Getting embedding for A cart is full of items.\n",
"Getting embedding for A blond man drinking water from a fountain.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for The person is interested in a water jet.\n",
"Getting embedding for A man in a blue shirt sits outside alone with a chessboard laid out in front of him.\n",
"Getting embedding for She is packing.\n",
"Getting embedding for A lady wearing a blue print shirt and green head cover smiles for the camera.\n",
"Getting embedding for A man doing a wheelie\n",
"Getting embedding for A man is on a dirt bike.\n",
"Getting embedding for The boy is young.\n",
"Getting embedding for A man wearing a multi-color coat is playing the guitar on the street.\n",
"Getting embedding for A female adult is near some kids.\n",
"Getting embedding for A man is leading a Clydesdale up a hay road, within a Old Country.\n",
"Getting embedding for A couple of people working around a pile of rocks.\n",
"Getting embedding for A man and two women sitting on a bench.\n",
"Getting embedding for A man is wearing a cap\n",
"Getting embedding for Girls and boys are having fun outdoors\n",
"Getting embedding for A group of children play jump rope in the streets while others watch in the background.\n",
"Getting embedding for The people stretched on yoga mats.\n",
"Getting embedding for A woman is at a machine.\n",
"Getting embedding for A child.\n",
"Getting embedding for Children's soccer game being played while the sun sets in the background.\n",
"Getting embedding for People take photos outdoors while a man performs exciting skateboarding tricks.\n",
"Getting embedding for A dog with an object in it's mouth is in the water.\n",
"Getting embedding for A little girl follows two guys with umbrellas down a path.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A couple strolls arm and arm and hand in hand down a city sidewalk.\n",
"Getting embedding for A white horse is pulling a cart while a man stands and watches.\n",
"Getting embedding for A Land Rover makes its way through a deep pond.\n",
"Getting embedding for A man wearing a blue shirt screaming or yelling with his arms raised up in the air.\n",
"Getting embedding for A man is standing in front of a shop.\n",
"Getting embedding for A motorcycle racer is in action at the track.\n",
"Getting embedding for A lady is on the floor packing a suitcase.\n",
"Getting embedding for They are avoiding trees.\n",
"Getting embedding for An old man is enjoying a beverage at a cafe.\n",
"Getting embedding for A family of three is at the beach.\n",
"Getting embedding for A woman checks her purse while at a outside cafe.\n",
"Getting embedding for A boy is holding his arms out.\n",
"Getting embedding for Cheerleaders cheer on a field for an activity.\n",
"Getting embedding for A guy is driving a dirt bike.\n",
"Getting embedding for Bicyclists waiting their turn to cross.\n",
"Getting embedding for A person throwing something for her dog.\n",
"Getting embedding for A male is getting a drink of water.\n",
"Getting embedding for There are women showing affection.\n",
"Getting embedding for A man sits in front of a set up chess game.\n",
"Getting embedding for A doctor in blue scrubs is performing an operation assisted by two men and a woman.\n",
"Getting embedding for The people are outdoors.\n",
"Getting embedding for A group of people stand on a grassy field.\n",
"Getting embedding for A man wakeboards on choppy water.\n",
"Getting embedding for A man with khaki shorts on is holding a little girls hand while she walks in the water of a creek.\n",
"Getting embedding for A boy is jumping on skateboard in the middle of a red bridge.\n",
"Getting embedding for A crowded street, in an Asian country, where the buildings are dominated by the Seiko building.\n",
"Getting embedding for A man is playing a game\n",
"Getting embedding for A little girl follows two guys with umbrellas down a path.\n",
"Getting embedding for A picture of a city is on a street\n",
"Getting embedding for A woman preparing to glaze\n",
"Getting embedding for A man dressed in blue shirt and shorts sits at a table while playing black in chess.\n",
"Getting embedding for A man pulls on a rope.\n",
"Getting embedding for A lady with sunglasses on her head and a green sweatshirt is looking off-camera.\n",
"Getting embedding for A man wearing a gray cap is looking down.\n",
"Getting embedding for A good-looking firefighter sets up \"Do Not Cross\" tape in the city.\n",
"Getting embedding for A man is pulling on a rope.\n",
"Getting embedding for A little boy with a blue jacket is making a sour face at the camera.\n",
"Getting embedding for A young man wearing a backpack and dark glasses approaches the brick building where there is a bit of graffiti on the wall.\n",
"Getting embedding for People sitting down to eat.\n",
"Getting embedding for A saddle bronc rider gets lifted out of the saddle, but keeps his grip during his ride.\n",
"Getting embedding for A baby is playing with a strand of beads.\n",
"Getting embedding for A lady in a black and white striped shirt and holding a bouquet of flowers, looks seriously at two gentlemen talking on the steps.\n",
"Getting embedding for Biker riding dirt bike on dirt track\n",
"Getting embedding for A boy is drinking out of a water fountain shaped like a woman.\n",
"Getting embedding for Four guys in wheelchairs on a basketball court two are trying to grab a basketball in midair.\n",
"Getting embedding for A man shows a woman something.\n",
"Getting embedding for A man with khaki shorts on is holding a little girls hand while she walks in the water of a creek.\n",
"Getting embedding for Three working men smile in front of a truck while holding construction equipment.\n",
"Getting embedding for A person is dipping her foot into water.\n",
"Getting embedding for People are standing on a grassy field\n",
"Getting embedding for A decorated man sees a scantily clad female.\n",
"Getting embedding for A woman in a floral dress talks to children in front of a van.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A woman in blue jeans and a dark jacket walks in front of a building.\n",
"Getting embedding for People are playing a sport.\n",
"Getting embedding for A log is being painted by a child.\n",
"Getting embedding for A blond man is drinking from a public fountain.\n",
"Getting embedding for A boy is wearing a shirt\n",
"Getting embedding for A young lady playing in front of the capitol building.\n",
"Getting embedding for A male is getting a drink of water.\n",
"Getting embedding for A man photographs a skateboarder doing tricks.\n",
"Getting embedding for A dog zips along the beach.\n",
"Getting embedding for A boy in a blue, yellow, and orange shirt holding his arms out from his sides.\n",
"Getting embedding for A man is running behind a dogsled being pulled by four dogs.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for A white dog running in the backyard.\n",
"Getting embedding for An old man is standing by a building in downtown.\n",
"Getting embedding for Three wheelchair basketball players playing basketball in field.\n",
"Getting embedding for A bearded man is pulling on a rope.\n",
"Getting embedding for A person in a red dress is running behind a black animal.\n",
"Getting embedding for A man is hanging up a picture of a child.\n",
"Getting embedding for The city has a lot of people in it.\n",
"Getting embedding for People are riding their bicycles.\n",
"Getting embedding for People are looking in a car.\n",
"Getting embedding for A woman and an animal are interacting outdoors.\n",
"Getting embedding for The cart is filled to the top.\n",
"Getting embedding for Women are waiting by a tram.\n",
"Getting embedding for The two people are dancing.\n",
"Getting embedding for three dogs are outside\n",
"Getting embedding for 2 women are carrying little girls.\n",
"Getting embedding for Somebody is hanging up pictures while people watch.\n",
"Getting embedding for A man is being bucked on a horse.\n",
"Getting embedding for A man and a woman are holding hands.\n",
"Getting embedding for A young woman packs belongings into a black luggage carrier.\n",
"Getting embedding for A man watches another man.\n",
"Getting embedding for A sign reads \"Welcome to Golden\"\n",
"Getting embedding for 3 people are walking along a path.\n",
"Getting embedding for The woman is eating a banana.\n",
"Getting embedding for A couple are playing with a young child outside.\n",
"Getting embedding for A child in a multicolored shirt is painting a log\n",
"Getting embedding for A man in a gold skirt sitting at his computer watching the computer screen.\n",
"Getting embedding for Soccer teams are competing outdoors.\n",
"Getting embedding for Three kids are sitting on a rock.\n",
"Getting embedding for A man stands.\n",
"Getting embedding for The mothers are having conversations.\n",
"Getting embedding for the workers are waiting for next work\n",
"Getting embedding for The three young childeren were hold an apple with a bite on it\n",
"Getting embedding for A group watches a practice.\n",
"Getting embedding for A family by a van.\n",
"Getting embedding for The two men sign something.\n",
"Getting embedding for A woman gazes skyward\n",
"Getting embedding for A woman is there.\n",
"Getting embedding for The dogs were outdoors running along the shore\n",
"Getting embedding for A competition is happening.\n",
"Getting embedding for There's a biker\n",
"Getting embedding for A football/soccer club during warm ups.\n",
"Getting embedding for The man is running.\n",
"Getting embedding for There are three girls\n",
"Getting embedding for A woman laughs while another paddles a boat.\n",
"Getting embedding for A soccer game is happening.\n",
"Getting embedding for people dance together\n",
"Getting embedding for A man is sitting down.\n",
"Getting embedding for Choir singing in mass.\n",
"Getting embedding for A Ford is being driven on a track.\n",
"Getting embedding for Bikers stop for gas\n",
"Getting embedding for A skier is near the rail.\n",
"Getting embedding for A firefighter sets up a do not cross\n",
"Getting embedding for Three children pose for a picture.\n",
"Getting embedding for A dog swims in a body of water.\n",
"Getting embedding for There is a farmer with a horse wagon\n",
"Getting embedding for Men are indoors.\n",
"Getting embedding for The women work in the health field.\n",
"Getting embedding for People are paying attention to the person hanging pictures\n",
"Getting embedding for The dogs are playing with a ball.\n",
"Getting embedding for Two people in wheelchairs are reaching in the air for a basketball.\n",
"Getting embedding for Men are near the water.\n",
"Getting embedding for The people are all jumping into a body of water.\n",
"Getting embedding for There are people in ChinaTown.\n",
"Getting embedding for A lady at a table takes pictures.\n",
"Getting embedding for The skier is wearing a jumpsuit.\n",
"Getting embedding for A woman is shown a tablet by a man standing on the street.\n",
"Getting embedding for A child has milk on their face.\n",
"Getting embedding for A guy playing a board game by himself.\n",
"Getting embedding for A group of young men are splashing a lot of water.\n",
"Getting embedding for A girl is running on the trail.\n",
"Getting embedding for Uniformed men work.\n",
"Getting embedding for A child is pulling a toy wagon.\n",
"Getting embedding for There are lots of cars on the street.\n",
"Getting embedding for A model is doing a shoot.\n",
"Getting embedding for Boy in costume followed by a girl in costume.\n",
"Getting embedding for People wait for a race to begin.\n",
"Getting embedding for a bunch of people are playing soccer\n",
"Getting embedding for Four people stand near a wall speaking to each other.\n",
"Getting embedding for Men are hanging something on the wall.\n",
"Getting embedding for People lounge about a pool.\n",
"Getting embedding for A boy is in a boat.\n",
"Getting embedding for A woman has children with her at the check out counter.\n",
"Getting embedding for A group of people are possing for an add.\n",
"Getting embedding for a sports game is being played\n",
"Getting embedding for A groom and bride are standing outside.\n",
"Getting embedding for A motorcyclist doing a wheelie\n",
"Getting embedding for A man is standing on top of a cart.\n",
"Getting embedding for A girl is holding a glass.\n",
"Getting embedding for A group of men are hanging a picture on a wall.\n",
"Getting embedding for A woman standing behind a grill outside with a blue basket of food in her hands.\n",
"Getting embedding for Some humans in a truck\n",
"Getting embedding for A pregnant lady shares the sounds of her pregnancy.\n",
"Getting embedding for A man stands in a bowling alley lane.\n",
"Getting embedding for Women are with their kids\n",
"Getting embedding for A girl is enjoying herself.\n",
"Getting embedding for A climber is ascending\n",
"Getting embedding for A man is hanging a picture of a child.\n",
"Getting embedding for She plays in a band.\n",
"Getting embedding for Near a hanging plant, a metal bowl was used to cook by two mens\n",
"Getting embedding for People are looking at sculptures at a museum.\n",
"Getting embedding for A dog runs across the beach.\n",
"Getting embedding for Three men are smiling\n",
"Getting embedding for a dog drops a red disc\n",
"Getting embedding for A human wearing a dress.\n",
"Getting embedding for Small boy in pool holds toy.\n",
"Getting embedding for The people are near the table.\n",
"Getting embedding for A person is outdoors, on a horse.\n",
"Getting embedding for People are on a bench\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for Three boys are in a body of water.\n",
"Getting embedding for The school children head home.\n",
"Getting embedding for people are dancing.\n",
"Getting embedding for A group sits outside while talking.\n",
"Getting embedding for He is wearing a multicolored shirt\n",
"Getting embedding for People are on their bikes.\n",
"Getting embedding for Two sumo wrestlers compete in a match.\n",
"Getting embedding for A young lady is looking at a picture.\n",
"Getting embedding for The woman is wearing white.\n",
"Getting embedding for Two people are next to each other.\n",
"Getting embedding for A bike it outside\n",
"Getting embedding for The adults are both male and female.\n",
"Getting embedding for A man is drinking juice.\n",
"Getting embedding for People are skydiving.\n",
"Getting embedding for A dog is outside playing in the water.\n",
"Getting embedding for There is a city.\n",
"Getting embedding for People are waiting for a race.\n",
"Getting embedding for A woman eats a banana and walks across a street, and there is a man trailing behind her.\n",
"Getting embedding for The woman is outdoors with a machine.\n",
"Getting embedding for Two men are in the kitchen.\n",
"Getting embedding for There is snow on the ground.\n",
"Getting embedding for The child is outdoors in his bright colored shirt.\n",
"Getting embedding for The people are on bikes.\n",
"Getting embedding for The people are holding onto the rail.\n",
"Getting embedding for A man is holding an accordian.\n",
"Getting embedding for A child plays on a playground.\n",
"Getting embedding for A boy painting a log.\n",
"Getting embedding for An old couple dance in by a juke box while a dude wearing shorts sleeps near a table\n",
"Getting embedding for The women and the man are on a bench.\n",
"Getting embedding for A man is wearing many silver necklaces.\n",
"Getting embedding for Someone is browsing jewelry.\n",
"Getting embedding for A town has witnessed the arrival of three bikers.\n",
"Getting embedding for The woman is wearing green.\n",
"Getting embedding for A baseball player is swinging to hit the ball.\n",
"Getting embedding for There are two men outside in this picture\n",
"Getting embedding for A man sits at a desk.\n",
"Getting embedding for Firemen walking outside\n",
"Getting embedding for A violinist among other string musicians\n",
"Getting embedding for There are soccer players on the field.\n",
"Getting embedding for A man is outside, near the street.\n",
"Getting embedding for Men watching motorcyclist.\n",
"Getting embedding for A couple is holding a child\n",
"Getting embedding for There are people on a sidewalk.\n",
"Getting embedding for There is a woman in a room.\n",
"Getting embedding for Three men are outside.\n",
"Getting embedding for The man is being stared at.\n",
"Getting embedding for A little boy is underwater.\n",
"Getting embedding for The bicycles are on a road.\n",
"Getting embedding for A man is playing the guitar.\n",
"Getting embedding for Two people walk away from a restaurant across a street.\n",
"Getting embedding for A man being photographed while he does a trick on his skateboard down the stairs.\n",
"Getting embedding for There are people in an urban area.\n",
"Getting embedding for A woman is outside.\n",
"Getting embedding for Man standing with three men in army uniform\n",
"Getting embedding for Two people are walking down a path.\n",
"Getting embedding for A woman is inside.\n",
"Getting embedding for one person sits while three stand near a body of water.\n",
"Getting embedding for A woman with children.\n",
"Getting embedding for A woman talks to two other women.\n",
"Getting embedding for A man and a woman are outside.\n",
"Getting embedding for An couple is outside.\n",
"Getting embedding for A boy walking away.\n",
"Getting embedding for There are people just getting on a train\n",
"Getting embedding for The man and woman are outdoors.\n",
"Getting embedding for A crowded street in Asia.\n",
"Getting embedding for A worker is doing something to a boat.\n",
"Getting embedding for there was a speed breaker on the road by which people are taking care\n",
"Getting embedding for The man walked alongside the crew.\n",
"Getting embedding for A woman is blonde\n",
"Getting embedding for A couple is talking.\n",
"Getting embedding for A crowd gesticulates.\n",
"Getting embedding for The chorus is singing.\n",
"Getting embedding for There is a little boy in brown pants.\n",
"Getting embedding for They are walking with a sign.\n",
"Getting embedding for The person skiis\n",
"Getting embedding for A man is wearing a shirt.\n",
"Getting embedding for someone in his twenties kicks at the ground\n",
"Getting embedding for Someone is making music.\n",
"Getting embedding for The man is pushing a stroller.\n",
"Getting embedding for A woman throws something for a dog.\n",
"Getting embedding for There is an individual waiting indoors.\n",
"Getting embedding for The person is surfing.\n",
"Getting embedding for A child swimming.\n",
"Getting embedding for An oddly dressed man walking.\n",
"Getting embedding for some car passing outside\n",
"Getting embedding for A child is dancing.\n",
"Getting embedding for The woman is wearing a coat.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for The man knows how to play guitar.\n",
"Getting embedding for A woman is near a fountain.\n",
"Getting embedding for A young lady is looking at her camera.\n",
"Getting embedding for A person eating.\n",
"Getting embedding for the dog is in the water\n",
"Getting embedding for A boy in a tri-colored shirt has his arms out to the side.\n",
"Getting embedding for A person is performing.\n",
"Getting embedding for the kid has milk on his face\n",
"Getting embedding for A woman with a green sweater has a happy expression.\n",
"Getting embedding for An animal is jumping in a place that is not hot.\n",
"Getting embedding for A person is cooking.\n",
"Getting embedding for The woman has one foot in the air.\n",
"Getting embedding for The man is cooking\n",
"Getting embedding for A woman is dressed stylishly in native garb.\n",
"Getting embedding for Two people are having a conversation.\n",
"Getting embedding for there are three dogs\n",
"Getting embedding for a man and woman are getting intimate\n",
"Getting embedding for A dog is playing in the grass.\n",
"Getting embedding for A person is painting.\n",
"Getting embedding for Two girls are playing outside.\n",
"Getting embedding for couple walking\n",
"Getting embedding for Some people are cheering on a field.\n",
"Getting embedding for The young violinist is a woman.\n",
"Getting embedding for A man is playing an instrument.\n",
"Getting embedding for A group of tourist waiting for a train at a train station.\n",
"Getting embedding for The two girls are outside.\n",
"Getting embedding for Thre are girls.\n",
"Getting embedding for A dog is swimming.\n",
"Getting embedding for A man is showing a woman something\n",
"Getting embedding for A girl is dancing in a brown shirt and blue jean skirt on a wooden floor\n",
"Getting embedding for A person holding a green bowling ball stands by the ball return machine in a bowling alley.\n",
"Getting embedding for A team is trying to tag a runner out.\n",
"Getting embedding for A boy sits at peoples feet.\n",
"Getting embedding for A couple of people are loading brush onto a trailer that is attached to a truck.\n",
"Getting embedding for the woman is outside\n",
"Getting embedding for The three men are outside.\n",
"Getting embedding for A man wearing weird clothes is walking through a park.\n",
"Getting embedding for There are a group of people are standing outside.\n",
"Getting embedding for A man is painting a landscape of an outdoors area.\n",
"Getting embedding for A skater is in the pool.\n",
"Getting embedding for A dog swims towards the camera.\n",
"Getting embedding for a player has a penalty kick\n",
"Getting embedding for There are people sitting down.\n",
"Getting embedding for The men are drinking.\n",
"Getting embedding for Someone is on a phone.\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for A mother is with her two children at walmart\n",
"Getting embedding for The women are exercising.\n",
"Getting embedding for There are people on bicycles.\n",
"Getting embedding for The man seems happy\n",
"Getting embedding for A person rides a bike outdoors.\n",
"Getting embedding for A dog on the beach.\n",
"Getting embedding for A man and a woman are standing.\n",
"Getting embedding for A young man is sitting.\n",
"Getting embedding for Cheerleaders cheering.\n",
"Getting embedding for Two adults walk across the street.\n",
"Getting embedding for People are near snow.\n",
"Getting embedding for Two people standing standing near a large statue, with other states nearby.\n",
"Getting embedding for People have bikes\n",
"Getting embedding for A woman wearing a blue shirt and green hat looks at the camera\n",
"Getting embedding for A man walks near a store\n",
"Getting embedding for The man is holding a balloon.\n",
"Getting embedding for A person is near a watermelon.\n",
"Getting embedding for A person is sitting down.\n",
"Getting embedding for City people in street\n",
"Getting embedding for A man, wearing a revolutionary apparel is making noise with a bell.\n",
"Getting embedding for Someone is toting packages in an urban setting.\n",
"Getting embedding for The water was choppy as the man parasailed.\n",
"Getting embedding for There is a crowded street.\n",
"Getting embedding for There is a family taking a walk outside.\n",
"Getting embedding for Two Girls are doing exercise.\n",
"Getting embedding for A dog is outside\n",
"Getting embedding for A woman is filling a suitcase.\n",
"Getting embedding for A woman is in Walmart\n",
"Getting embedding for The toddler is getting a checkup.\n",
"Getting embedding for The couple danced.\n",
"Getting embedding for there is a group of people waiting outside\n",
"Getting embedding for A vendor trying to attract costumers.\n",
"Getting embedding for A man is being moved.\n",
"Getting embedding for A man and woman walk on a street.\n",
"Getting embedding for A dog is in the water.\n",
"Getting embedding for There's a group of men hanging up a picture.\n",
"Getting embedding for Man riding bike\n",
"Getting embedding for Two people are seated together.\n",
"Getting embedding for A dog is running outside.\n",
"Getting embedding for a woman in an office building talks to two other women and a man with notepads\n",
"Getting embedding for Two boys are skateboarding outside.\n",
"Getting embedding for There are two people in this picture\n",
"Getting embedding for Bikers stop and wait for traffic at the intersection.\n",
"Getting embedding for A human is riding a vehicle.\n",
"Getting embedding for The cars are outside.\n",
"Getting embedding for there is a bmx biker who is perfect in ravine jumping\n",
"Getting embedding for A couple carrying a child are walking along water.\n",
"Getting embedding for J.P. Morgan Chase held a track event.\n",
"Getting embedding for a man is looking at his webcam\n",
"Getting embedding for Two girls lay next to wooden blocks.\n",
"Getting embedding for boy walks accross patio\n",
"Getting embedding for A man is wearing a bright green shirt\n",
"Getting embedding for A dog carries a stick in his mouth.\n",
"Getting embedding for A barber is at work.\n",
"Getting embedding for girl throws stones\n",
"Getting embedding for The woman is outdoors.\n",
"Getting embedding for The child had milk all over his face.\n",
"Getting embedding for A human standing.\n",
"Getting embedding for The men are working.\n",
"Getting embedding for Two guys cook using some rice milk.\n",
"Getting embedding for young people are gathered around a table\n",
"Getting embedding for Two people dancing outdoors.\n",
"Getting embedding for A dog is nearby a person\n",
"Getting embedding for There are at least four people.\n",
"Getting embedding for A man hangs a poster in front of a shop.\n",
"Getting embedding for People are playing soccer.\n",
"Getting embedding for A building is being painted.\n",
"Getting embedding for group of people running\n",
"Getting embedding for The Asian city is full of sights.\n",
"Getting embedding for Pitcher is winding up a throw\n",
"Getting embedding for there are some groups one of them of young females are in dresses strolling for a ramp walk\n",
"Getting embedding for A woman talks to others indoors.\n",
"Getting embedding for There is a lot of fruit.\n",
"Getting embedding for The people are walking outdoors.\n",
"Getting embedding for Humans playing jump rope\n",
"Getting embedding for An artist is sculpting with clay.\n",
"Getting embedding for A group of people are playing a game of soccer.\n",
"Getting embedding for People standing near sculptures\n",
"Getting embedding for A biker is doing jumps by trees.\n",
"Getting embedding for A young man has his head on the table.\n",
"Getting embedding for The people are listening to a speaker.\n",
"Getting embedding for A lady is close to the floor.\n",
"Getting embedding for People are holding hands and walking.\n",
"Getting embedding for Three puppies are playing outside.\n",
"Getting embedding for A woman is lying down.\n",
"Getting embedding for A woman is wearing a red jacket\n",
"Getting embedding for the man is dancing\n",
"Getting embedding for A man stares at a passing couple.\n",
"Getting embedding for Two old men in winter coats talk outside.\n",
"Getting embedding for A couple of people are holding hands while walking.\n",
"Getting embedding for A man is decorating a shop.\n",
"Getting embedding for A man is advertising for a restaurant.\n",
"Getting embedding for Woman wearing a shirt sewing.\n",
"Getting embedding for Busy ChinaTown street corner where people are walking past an open front store.\n",
"Getting embedding for A man walking proudly down the street.\n",
"Getting embedding for A town has witnessed the arrival of three bikers.\n",
"Getting embedding for a young man wearing a backpack and sunglasses is walking towards a shopping area.\n",
"Getting embedding for A man sitting in a barber shop.\n",
"Getting embedding for A man, wearing revolutionary period clothes, is ringing a bell.\n",
"Getting embedding for Humans playing jump rope\n",
"Getting embedding for A woman talks to others indoors.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for Three boys are in a body of water.\n",
"Getting embedding for Woman with green sweater and sunglasses smiling\n",
"Getting embedding for A man in blue shorts and without a shirt is jogging down the road while listening to his iPod.\n",
"Getting embedding for The people are holding onto the rail.\n",
"Getting embedding for Two adults, one female in white, with shades and one male, gray clothes, walking across a street, away from a eatery with a blurred image of a dark colored red shirted person in the foreground.\n",
"Getting embedding for The city has a lot of people in it.\n",
"Getting embedding for there are some groups one of them of young females are in dresses strolling for a ramp walk\n",
"Getting embedding for The people are all jumping into a body of water.\n",
"Getting embedding for People sit and relax next to a pool in a plaza.\n",
"Getting embedding for A man wearing blue jeans and red bowling shoes stands in a bowling alley lane with a green ball in his hand.\n",
"Getting embedding for A boy sits at peoples feet.\n",
"Getting embedding for a man is looking at his webcam\n",
"Getting embedding for A man sits at a desk.\n",
"Getting embedding for there is a group of people waiting outside\n",
"Getting embedding for People in a meeting setting paying attention to a speaker in an orange shirt.\n",
"Getting embedding for one person sits while three stand near a body of water.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for Two men in wheelchairs are reaching in the air for a basketball.\n",
"Getting embedding for Tourists waiting at a train stop.\n",
"Getting embedding for there is a bmx biker who is perfect in ravine jumping\n",
"Getting embedding for Group of young women in dresses strolling on the sidewalk.\n",
"Getting embedding for Cheerleaders are on the field cheering.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for One biker is running with their bike while another is riding around them.\n",
"Getting embedding for two female medical personnel read their charts.\n",
"Getting embedding for Three men are smiling and posing behind a truck loaded with various construction supplies.\n",
"Getting embedding for Two people are next to each other.\n",
"Getting embedding for The woman is wearing white.\n",
"Getting embedding for Stacks of neatly folded clothing cover most of this floor while a woman with a beige shirt and jeans busily fills a suitcase.\n",
"Getting embedding for A young woman packs belongings into a black luggage carrier.\n",
"Getting embedding for BMX biker jumps over a ravine.\n",
"Getting embedding for Two old men in winter coats talk outside.\n",
"Getting embedding for There is a little boy in brown pants.\n",
"Getting embedding for Some humans in a truck\n",
"Getting embedding for two small girls walk along the leaves.\n",
"Getting embedding for A young woman is playing the violin.\n",
"Getting embedding for The man is cooking\n",
"Getting embedding for Two men in wheelchairs are reaching in the air for a basketball.\n",
"Getting embedding for three dogs are outside\n",
"Getting embedding for A man sits at a desk.\n",
"Getting embedding for Two men are on scaffolding as they paint above a storefront while a man on the sidewalk stands next to them talking on the phone.\n",
"Getting embedding for Two women, each embracing a little girl, catch up at a small family gathering.\n",
"Getting embedding for There is an individual waiting indoors.\n",
"Getting embedding for Two young men drink beer, leaning on a graffitied wall.\n",
"Getting embedding for People sit and relax next to a pool in a plaza.\n",
"Getting embedding for Uniformed men work.\n",
"Getting embedding for A man with a bright green shirt is talking to a woman in a pink shirt.\n",
"Getting embedding for Two men with heads down signing a paper.\n",
"Getting embedding for People in a meeting setting paying attention to a speaker in an orange shirt.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for There is an individual waiting indoors.\n",
"Getting embedding for Two guys cook using some rice milk.\n",
"Getting embedding for A lady at a table takes pictures.\n",
"Getting embedding for a woman in a red jacket watches as a black and brown dog runs away from her in woodland clearing.\n",
"Getting embedding for There is an individual waiting indoors.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for Three boys are in a body of water.\n",
"Getting embedding for A spotted black and white dog splashes in the water.\n",
"Getting embedding for Skydivers in formation.\n",
"Getting embedding for A woman is making a clay pot.\n",
"Getting embedding for People are on a bench\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for There is a little boy in brown pants.\n",
"Getting embedding for An excited, smiling woman stands at a red railing as she holds a boombox to one side.\n",
"Getting embedding for There are four Eastern guys working on and hanging pictures of humans.\n",
"Getting embedding for A man is hanging up a picture of a child.\n",
"Getting embedding for Three kids are sitting on a rock.\n",
"Getting embedding for The three young childeren were hold an apple with a bite on it\n",
"Getting embedding for There's a group of men hanging up a picture.\n",
"Getting embedding for Two people standing in front of a large statue of a woman, other statues and busts visible in the background.\n",
"Getting embedding for People are paying attention to the person hanging pictures\n",
"Getting embedding for A crowd gesticulates.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for People are paying attention to the person hanging pictures\n",
"Getting embedding for Two adults walk across the street.\n",
"Getting embedding for Woman wearing a shirt sewing.\n",
"Getting embedding for Men fish on a concrete slab.\n",
"Getting embedding for A town has witnessed the arrival of three bikers.\n",
"Getting embedding for Motorcyclist performing while two men watch.\n",
"Getting embedding for Two little girls lie on the carpet next to an O made of wooden blocks.\n",
"Getting embedding for Two older men in winter coats talking outside of a shop with a grassy lawn covered in a light coat of snow in front of it.\n",
"Getting embedding for The women are exercising.\n",
"Getting embedding for The boy locked the cycle and went away.\n",
"Getting embedding for The people are all jumping into a body of water.\n",
"Getting embedding for There's a group of men hanging up a picture.\n",
"Getting embedding for They are walking with a sign.\n",
"Getting embedding for People are waiting for a race.\n",
"Getting embedding for Men watching motorcyclist.\n",
"Getting embedding for there is a bmx biker who is perfect in ravine jumping\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for A gentleman in a purple scarf and hat is looking at money while holding an accordion.\n",
"Getting embedding for Wet brown dog swims towards camera.\n",
"Getting embedding for There is a woman in a room.\n",
"Getting embedding for young people are gathered around a table\n",
"Getting embedding for One biker is running with their bike while another is riding around them.\n",
"Getting embedding for High fashion ladies wait outside a tram beside a crowd of people in the city.\n",
"Getting embedding for Two girls, each in a dress walking together.\n",
"Getting embedding for There is snow on the ground.\n",
"Getting embedding for People standing near sculptures\n",
"Getting embedding for The women and the man are on a bench.\n",
"Getting embedding for People are looking at sculptures at a museum.\n",
"Getting embedding for The young man is waiting with others on the sidewalk.\n",
"Getting embedding for A girl is enjoying herself.\n",
"Getting embedding for Firemen walking outside\n",
"Getting embedding for The people are walking outdoors.\n",
"Getting embedding for There are people just getting on a train\n",
"Getting embedding for Various people hanging around outside of a building.\n",
"Getting embedding for The two men sign something.\n",
"Getting embedding for One biker is running with their bike while another is riding around them.\n",
"Getting embedding for young people are gathered around a table\n",
"Getting embedding for The woman is wearing white.\n",
"Getting embedding for Two girls are playing outside.\n",
"Getting embedding for Group of young women in dresses strolling on the sidewalk.\n",
"Getting embedding for She plays in a band.\n",
"Getting embedding for Two older men in winter coats talking outside of a shop with a grassy lawn covered in a light coat of snow in front of it.\n",
"Getting embedding for Woman wearing a shirt sewing.\n",
"Getting embedding for A model is doing a shoot.\n",
"Getting embedding for the dog is in the water\n",
"Getting embedding for An older man is drinking orange juice at a restaurant.\n",
"Getting embedding for There are soccer players on the field.\n",
"Getting embedding for An old couple dance in by a juke box while a dude wearing shorts sleeps near a table\n",
"Getting embedding for Woman wearing a shirt sewing.\n",
"Getting embedding for A woman is lying down.\n",
"Getting embedding for A man, wearing a revolutionary apparel is making noise with a bell.\n",
"Getting embedding for The skier is wearing a jumpsuit.\n",
"Getting embedding for The person skiis\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for two small girls walk along the leaves.\n",
"Getting embedding for Two people walk away from a restaurant across a street.\n",
"Getting embedding for Cheerleaders cheering.\n",
"Getting embedding for Two people are having a conversation.\n",
"Getting embedding for Two people standing in front of a large statue of a woman, other statues and busts visible in the background.\n",
"Getting embedding for A woman has children with her at the check out counter.\n",
"Getting embedding for A dog is preparing to run away from a person interacting with it.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for Choir singing in mass.\n",
"Getting embedding for Three puppies are playing outside.\n",
"Getting embedding for someone in his twenties kicks at the ground\n",
"Getting embedding for there was a speed breaker on the road by which people are taking care\n",
"Getting embedding for An elderly couple dance in front of a juke box while a guy in shorts sleeps at a nearby table\n",
"Getting embedding for Boy in costume followed by a girl in costume.\n",
"Getting embedding for A dog drops a red disc on a beach.\n",
"Getting embedding for A person is painting.\n",
"Getting embedding for Three men are outside.\n",
"Getting embedding for Young woman in a cafe checking her cellphone.\n",
"Getting embedding for The man is holding a balloon.\n",
"Getting embedding for Children going home from school.\n",
"Getting embedding for The dogs run and play with a red ball.\n",
"Getting embedding for Little girl in a blue and yellow plaid outfit and blue hat is running along the trail.\n",
"Getting embedding for Soccer teams play on a field as the sun sets behind a line of trees.\n",
"Getting embedding for The women work in the health field.\n",
"Getting embedding for Children going home from school.\n",
"Getting embedding for The man is pushing a stroller.\n",
"Getting embedding for An average looking man is playing the guitar.\n",
"Getting embedding for The water was choppy as the man parasailed.\n",
"Getting embedding for There is a woman holding a baby, along with a man with a save the children bag.\n",
"Getting embedding for There are people in ChinaTown.\n",
"Getting embedding for Two Asian people sit at a blue table in a food court.\n",
"Getting embedding for a young man wearing a backpack and sunglasses is walking towards a shopping area.\n",
"Getting embedding for The child is outdoors in his bright colored shirt.\n",
"Getting embedding for Men are hanging something on the wall.\n",
"Getting embedding for Two older men are talking.\n",
"Getting embedding for There are lots of cars on the street.\n",
"Getting embedding for The adults are both male and female.\n",
"Getting embedding for People are near snow.\n",
"Getting embedding for group of people running\n",
"Getting embedding for Two guys cook using some rice milk.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for There are three girls\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for A man is showing a woman something\n",
"Getting embedding for City people in street\n",
"Getting embedding for A woman in colorful native attire featuring a blue shirt with a colorful design displays her dark hair braided with red ribbons.\n",
"Getting embedding for A young man has his head on the table.\n",
"Getting embedding for couple walking\n",
"Getting embedding for Small boy in pool holds toy.\n",
"Getting embedding for The woman has one foot in the air.\n",
"Getting embedding for There are people on bicycles.\n",
"Getting embedding for Young woman running as two guys in the back try to catch up to her.\n",
"Getting embedding for Two girls lay next to wooden blocks.\n",
"Getting embedding for The child is outdoors in his bright colored shirt.\n",
"Getting embedding for There is a woman holding a baby, along with a man with a save the children bag.\n",
"Getting embedding for couple walking\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A man wearing blue jeans and red bowling shoes stands in a bowling alley lane with a green ball in his hand.\n",
"Getting embedding for A shot-on-goal action photo of soccer players in red and black uniforms.\n",
"Getting embedding for They are walking with a sign.\n",
"Getting embedding for Two people standing standing near a large statue, with other states nearby.\n",
"Getting embedding for Two old men in winter coats talk outside.\n",
"Getting embedding for people dance together\n",
"Getting embedding for Two sumo wrestlers compete in a match.\n",
"Getting embedding for A woman talks to two other women.\n",
"Getting embedding for A man riding a dirt bike\n",
"Getting embedding for The man is cooking\n",
"Getting embedding for A woman is making a clay pot.\n",
"Getting embedding for man sitting down playing a game of chess alone\n",
"Getting embedding for Two people walk away from a restaurant across a street.\n",
"Getting embedding for The men are drinking.\n",
"Getting embedding for A model is doing a shoot.\n",
"Getting embedding for Young woman in a cafe checking her cellphone.\n",
"Getting embedding for Woman at Walmart check-out having her groceries bagged by an employee.\n",
"Getting embedding for The woman is eating a banana.\n",
"Getting embedding for People waiting at a light on bikes.\n",
"Getting embedding for Two people standing in front of a large statue of a woman, other statues and busts visible in the background.\n",
"Getting embedding for A male painting a scene in front of him.\n",
"Getting embedding for They are walking with a sign.\n",
"Getting embedding for Two older men are talking.\n",
"Getting embedding for Asian city scene of people in street with bright lights and glass buildings behind.\n",
"Getting embedding for Men watching motorcyclist.\n",
"Getting embedding for The person skiis\n",
"Getting embedding for the kid has milk on his face\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A woman is making a clay pot.\n",
"Getting embedding for Two young girls are playing outside in a non-urban environment.\n",
"Getting embedding for Mothers with children talking at a gathering.\n",
"Getting embedding for Two Asian people sit at a blue table in a food court.\n",
"Getting embedding for the woman is outside\n",
"Getting embedding for A young lady is looking at her camera.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for Two men are on scaffolding as they paint above a storefront while a man on the sidewalk stands next to them talking on the phone.\n",
"Getting embedding for two little girls, one in a green jacket and one in a pink jacket, and a little boy in a green jacket holding an apple sitting on a rock.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for Two men are standing outside and snow is on the ground.\n",
"Getting embedding for A group of people sitting at a table outside talking.\n",
"Getting embedding for Two people are next to each other.\n",
"Getting embedding for A man with a gray shirt holds a young infant in his hands.\n",
"Getting embedding for There is a little boy in brown pants.\n",
"Getting embedding for a woman in a red jacket watches as a black and brown dog runs away from her in woodland clearing.\n",
"Getting embedding for Three dogs in different shades of brown and white biting and licking each other.\n",
"Getting embedding for People are skydiving.\n",
"Getting embedding for The man walked alongside the crew.\n",
"Getting embedding for The child is outdoors in his bright colored shirt.\n",
"Getting embedding for A town has witnessed the arrival of three bikers.\n",
"Getting embedding for The people are near the table.\n",
"Getting embedding for Three children pose for a picture.\n",
"Getting embedding for There is a family taking a walk outside.\n",
"Getting embedding for A person on skis on a rail at night.\n",
"Getting embedding for Various people hanging around outside of a building.\n",
"Getting embedding for An expectant woman happily lets another listen to the baby inside of her.\n",
"Getting embedding for The young violinist is a woman.\n",
"Getting embedding for People in a meeting setting paying attention to a speaker in an orange shirt.\n",
"Getting embedding for Pitcher is winding up a throw\n",
"Getting embedding for a couple are holding hands behind their backs while walking down a street, and the man has his arm around her shoulder.\n",
"Getting embedding for The mothers are having conversations.\n",
"Getting embedding for two little girls, one in a green jacket and one in a pink jacket, and a little boy in a green jacket holding an apple sitting on a rock.\n",
"Getting embedding for A soccer match between a team with white jerseys, and a team with yellow jerseys.\n",
"Getting embedding for some car passing outside\n",
"Getting embedding for The woman is outdoors with a machine.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for There are people sitting down.\n",
"Getting embedding for Older couple posing for a picture in front of a fountain.\n",
"Getting embedding for Near a hanging plant, a metal bowl was used to cook by two mens\n",
"Getting embedding for A man is holding an accordian.\n",
"Getting embedding for The women are exercising.\n",
"Getting embedding for A dog swims towards the camera.\n",
"Getting embedding for Busy ChinaTown street corner where people are walking past an open front store.\n",
"Getting embedding for The couple danced.\n",
"Getting embedding for The cars are outside.\n",
"Getting embedding for Three wheelchair basketball players playing basketball in field.\n",
"Getting embedding for A man and two women in black jackets holding umbrellas sit on a long wooden bench.\n",
"Getting embedding for The two people are dancing.\n",
"Getting embedding for There is a little boy in brown pants.\n",
"Getting embedding for The man knows how to play guitar.\n",
"Getting embedding for A young man doing a trick on a skateboard down the stairs while being photographed.\n",
"Getting embedding for The men are drinking.\n",
"Getting embedding for Two pre-teen girls listening to music on an MP3 player with headphones.\n",
"Getting embedding for Two men are around a bowl.\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for two female medical personnel read their charts.\n",
"Getting embedding for Several people are dancing together in sync.\n",
"Getting embedding for People are on their bikes.\n",
"Getting embedding for Two older men in winter coats talking outside of a shop with a grassy lawn covered in a light coat of snow in front of it.\n",
"Getting embedding for Three puppies are playing outside.\n",
"Getting embedding for An excited, smiling woman stands at a red railing as she holds a boombox to one side.\n",
"Getting embedding for Emergency personnel looking into the back of a car.\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for Two men are on scaffolding as they paint above a storefront while a man on the sidewalk stands next to them talking on the phone.\n",
"Getting embedding for Man riding bike\n",
"Getting embedding for The women are exercising.\n",
"Getting embedding for Three wheelchair basketball players playing basketball in field.\n",
"Getting embedding for There is a city.\n",
"Getting embedding for Three puppies are playing outside.\n",
"Getting embedding for A woman is walking across the street eating a banana, while a man is following with his briefcase.\n",
"Getting embedding for A meeting of young people sitting at a conference table.\n",
"Getting embedding for There are people just getting on a train\n",
"Getting embedding for An animal is jumping in a place that is not hot.\n",
"Getting embedding for A dog drops a red disc on a beach.\n",
"Getting embedding for A person on a horse jumps over a broken down airplane.\n",
"Getting embedding for The surfer catches a big wave but stays on his board.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for A group of people are playing a game of soccer.\n",
"Getting embedding for Two men are standing outside and snow is on the ground.\n",
"Getting embedding for A small white dog running on a pebble covered beach.\n",
"Getting embedding for Two sumo wrestlers compete in a match.\n",
"Getting embedding for Men in uniform work together.\n",
"Getting embedding for someone in his twenties kicks at the ground\n",
"Getting embedding for A smiling man cooks something delicious.\n",
"Getting embedding for Some people are cheering on a field.\n",
"Getting embedding for The furry brown dog is swimming in the ocean.\n",
"Getting embedding for A woman talks to others indoors.\n",
"Getting embedding for The cart is filled to the top.\n",
"Getting embedding for People are waiting for a race.\n",
"Getting embedding for Two sumo wrestlers compete in a match.\n",
"Getting embedding for There are four Eastern guys working on and hanging pictures of humans.\n",
"Getting embedding for The people are near the table.\n",
"Getting embedding for There is a city.\n",
"Getting embedding for A man parasails in the choppy water.\n",
"Getting embedding for The skier is wearing a yellow jumpsuit and sliding across a yellow rail.\n",
"Getting embedding for A small white dog running on a pebble covered beach.\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for young people are gathered around a table\n",
"Getting embedding for There are four Eastern guys working on and hanging pictures of humans.\n",
"Getting embedding for The chorus is singing.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for The school children head home.\n",
"Getting embedding for A woman in a blue shirt and green hat looks up at the camera.\n",
"Getting embedding for Uniformed men work.\n",
"Getting embedding for Two tan and white dogs and one tan dog racing down the beach near the water.\n",
"Getting embedding for Two men stand around a mixing bowl.\n",
"Getting embedding for A model is doing a shoot.\n",
"Getting embedding for The city has a lot of people in it.\n",
"Getting embedding for Toddler in striped sweatshirt plays on rope on playground.\n",
"Getting embedding for Children playing a game.\n",
"Getting embedding for The bicyclists are outside.\n",
"Getting embedding for There are people next to sculptures.\n",
"Getting embedding for A boy in a shirt plays outside.\n",
"Getting embedding for Workers are on break.\n",
"Getting embedding for A woman has her back to the camera.\n",
"Getting embedding for Two people are looking at a clipboard.\n",
"Getting embedding for A man wants a woman to look at his clipboard\n",
"Getting embedding for The boy is wearing a shirt.\n",
"Getting embedding for The child is painting.\n",
"Getting embedding for One person running next to their bike with the person riding their bike behind them.\n",
"Getting embedding for Two men are barefoot on the lawn.\n",
"Getting embedding for A soccer game is in progress.\n",
"Getting embedding for A family walks along a dirt path.\n",
"Getting embedding for There is a man and a woman.\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for A man in a colorful shirt is playing an instrument.\n",
"Getting embedding for the man is working on the computer\n",
"Getting embedding for the person is hanging pictures.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for The white and brown dog is in the air.\n",
"Getting embedding for people sit around table\n",
"Getting embedding for A woman stand on a fountain and dips her toes in.\n",
"Getting embedding for The shop sign says \"Welcome to Golden\"\n",
"Getting embedding for The man is drinking water.\n",
"Getting embedding for There are two woman in this picture.\n",
"Getting embedding for A child.\n",
"Getting embedding for Someone is on top of a cart full of items, while someone else observes.\n",
"Getting embedding for A log is being painted by a child.\n",
"Getting embedding for There are people in front of buildings that are brightly lit.\n",
"Getting embedding for A boy is wearing a shirt\n",
"Getting embedding for a person in orange\n",
"Getting embedding for A man is windsurfing.\n",
"Getting embedding for The child was walking near the grass making a funny face.\n",
"Getting embedding for THe woman is sitting down\n",
"Getting embedding for The racer is driving.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for a guy is performing a bike trick\n",
"Getting embedding for The firemen are gathered one is holding his helmet.\n",
"Getting embedding for There is a girl standing\n",
"Getting embedding for A person looks in her purse at a restaurant.\n",
"Getting embedding for People are playing soccer.\n",
"Getting embedding for The cheerleaders are wearing clothes.\n",
"Getting embedding for People are about to eat.\n",
"Getting embedding for pwople are on stage\n",
"Getting embedding for A person holding a boombox.\n",
"Getting embedding for The child is happy.\n",
"Getting embedding for People are playing a sport.\n",
"Getting embedding for An elderly woman puts carrots into a casserole\n",
"Getting embedding for There is a man in front of the shop.\n",
"Getting embedding for A pair of dogs tease a third with nibbles.\n",
"Getting embedding for Two animals getting to know each other.\n",
"Getting embedding for A woman in white.\n",
"Getting embedding for A car drives through the water.\n",
"Getting embedding for The people stretched on yoga mats.\n",
"Getting embedding for The boy is making a mess.\n",
"Getting embedding for A human walks up some stairs.\n",
"Getting embedding for People are working.\n",
"Getting embedding for Two kids are running.\n",
"Getting embedding for A few people share a bench.\n",
"Getting embedding for A lady looks at a picture on her camera\n",
"Getting embedding for Three men looking at a car.\n",
"Getting embedding for Children playing soccer while the sun sets.\n",
"Getting embedding for The old man is standing outside of a building.\n",
"Getting embedding for An animal is walking outside.\n",
"Getting embedding for The people are doing yoga\n",
"Getting embedding for A woman is talking to children.\n",
"Getting embedding for Kids pose in front of a mountain background.\n",
"Getting embedding for The man is playing music on an instrument.\n",
"Getting embedding for Kids are playing outdoors.\n",
"Getting embedding for Someone sitting outside behind a chessboard.\n",
"Getting embedding for A man doing a wheelie\n",
"Getting embedding for People are fishing and walking next to the water.\n",
"Getting embedding for A man is photographing another man.\n",
"Getting embedding for People are having a discussion.\n",
"Getting embedding for The boy is young.\n",
"Getting embedding for A woman walking outside.\n",
"Getting embedding for People are outside.\n",
"Getting embedding for People are waiting to eat.\n",
"Getting embedding for Three men are grouped around the back of a car.\n",
"Getting embedding for The cowboy waved to the crowd.\n",
"Getting embedding for Two men serving food.\n",
"Getting embedding for A dog is outside.\n",
"Getting embedding for A woman on top of her clothes.\n",
"Getting embedding for A group of people are outside.\n",
"Getting embedding for A group of people gathering on the grass.\n",
"Getting embedding for A man is outdoors.\n",
"Getting embedding for A cowboy rides a bull at a rodeo.\n",
"Getting embedding for A guy stands on stage with his guitar.\n",
"Getting embedding for The workers are standing still.\n",
"Getting embedding for A group of bikers are in the street.\n",
"Getting embedding for A female is next to a man.\n",
"Getting embedding for Two adults walking across a road\n",
"Getting embedding for A groups of people acts on stage.\n",
"Getting embedding for Two people are next to a fountain together.\n",
"Getting embedding for A woman is at a machine.\n",
"Getting embedding for Schoolchildren together\n",
"Getting embedding for The Seiko building is large.\n",
"Getting embedding for People are in the street.\n",
"Getting embedding for A yoga class is in progress.\n",
"Getting embedding for There is a soccer game.\n",
"Getting embedding for There are bicyclists stopped at a road.\n",
"Getting embedding for An Asian woman is smiling at while another lady is rowing.\n",
"Getting embedding for There are some guys in this picture\n",
"Getting embedding for A firefighter sets up tape in a city\n",
"Getting embedding for The girl blows a butterfly.\n",
"Getting embedding for Children enjoy playing together.\n",
"Getting embedding for A classroom is discussing the topics of the day.\n",
"Getting embedding for A group of people point forwards while doing something.\n",
"Getting embedding for the bike is tied to a sign\n",
"Getting embedding for A man plays bowling.\n",
"Getting embedding for People in orange vests await the beginning of a race.\n",
"Getting embedding for A city filled with people in the middle of the daytime.\n",
"Getting embedding for A fireman is working hard to keep people safe.\n",
"Getting embedding for A woman is wearing a green sweatshirt.\n",
"Getting embedding for A man is running behind a sled.\n",
"Getting embedding for Someone is wearing formal clothes.\n",
"Getting embedding for Children are jumping rope.\n",
"Getting embedding for man ringing a bell\n",
"Getting embedding for A young girl has a bowl on her head\n",
"Getting embedding for Brown dog treads through water as he is soaked in water\n",
"Getting embedding for People are near water.\n",
"Getting embedding for The man has something to tell the woman.\n",
"Getting embedding for The people are by the wall.\n",
"Getting embedding for Peole stand by a building supply truck.\n",
"Getting embedding for The man is fertilizering his garden.\n",
"Getting embedding for A school is hosting an event.\n",
"Getting embedding for People playing jump rope.\n",
"Getting embedding for Lady sits gazing at a camera.\n",
"Getting embedding for The man rides an animal.\n",
"Getting embedding for The boy does a skateboarding trick.\n",
"Getting embedding for The man is outside.\n",
"Getting embedding for A man is wearing something with writing on it.\n",
"Getting embedding for A woman preparing to glaze\n",
"Getting embedding for A decorated man sees a scantily clad female.\n",
"Getting embedding for There are three men working.\n",
"Getting embedding for A male is getting a drink of water.\n",
"Getting embedding for People are standing on a grassy field\n",
"Getting embedding for A person throwing something for her dog.\n",
"Getting embedding for A lady wearing a blue shirt.\n",
"Getting embedding for The woman throws a Frisbee to the dog.\n",
"Getting embedding for A person looks up.\n",
"Getting embedding for A group of children are posing.\n",
"Getting embedding for Four guys are playing basketball.\n",
"Getting embedding for One man writes on papers, while another man stands by.\n",
"Getting embedding for A dog with an object in it's mouth is in the water.\n",
"Getting embedding for Two adults walk across a street.\n",
"Getting embedding for A foreign family walks by a dirt trail along a body of water.\n",
"Getting embedding for Pirate on the sidewalk\n",
"Getting embedding for a old man was talking\n",
"Getting embedding for A man stands at the bottom of the stairs.\n",
"Getting embedding for There is a soccer game with a team in yellow.\n",
"Getting embedding for There is a person processing vegetables.\n",
"Getting embedding for The person stares off into the distance.\n",
"Getting embedding for The man is laying down to sleep\n",
"Getting embedding for A street performer is trying to earn extra money.\n",
"Getting embedding for Two men are laughing and enjoying themselves.\n",
"Getting embedding for Two people are outside.\n",
"Getting embedding for There are scultupres nearby.\n",
"Getting embedding for She is packing.\n",
"Getting embedding for a woman looking at her cellphone\n",
"Getting embedding for Bicyclists waiting their turn to cross.\n",
"Getting embedding for People pose for a picture.\n",
"Getting embedding for A photographer takes a picture of the boy's parents by the fountain.\n",
"Getting embedding for A child was making a mess with milk.\n",
"Getting embedding for The family is outside.\n",
"Getting embedding for A family of foreigners walks by the water.\n",
"Getting embedding for A woman is like to touch the water in fountain\n",
"Getting embedding for There are people waiting on a train.\n",
"Getting embedding for People are walking outdoors.\n",
"Getting embedding for A car is flooding.\n",
"Getting embedding for Man sitting on a motorcycle on the sidewalk\n",
"Getting embedding for A dog is running outdoors.\n",
"Getting embedding for two people by a fountain\n",
"Getting embedding for The toddler has milk around the corners of his mouth.\n",
"Getting embedding for A young lady playing in front of the capitol building.\n",
"Getting embedding for The old man is painting a portrait.\n",
"Getting embedding for The woman and man are outdoors.\n",
"Getting embedding for A baby walks on the ground.\n",
"Getting embedding for The children are playing in a rocky field.\n",
"Getting embedding for the dogs see each other\n",
"Getting embedding for a woman and a german shepherd are pictured\n",
"Getting embedding for a woman is talking\n",
"Getting embedding for The woman in green and pink is dancing.\n",
"Getting embedding for They are outside wearing coats.\n",
"Getting embedding for A man makes a ruckus.\n",
"Getting embedding for Near a couple of restaurants, two people walk across the street.\n",
"Getting embedding for People are wearing colorful clothes\n",
"Getting embedding for A cart is full of items.\n",
"Getting embedding for Some men are sitting outdoors.\n",
"Getting embedding for A soccer player jumping up while a game is in progess.\n",
"Getting embedding for A blond man drinking water from a fountain.\n",
"Getting embedding for They are avoiding trees.\n",
"Getting embedding for Man looking at the camera.\n",
"Getting embedding for A couple are having a conversation\n",
"Getting embedding for A man photographs a skateboarder doing tricks.\n",
"Getting embedding for A man is pulling on a rope.\n",
"Getting embedding for Four people congregate near the water.\n",
"Getting embedding for A man is taking photos of skateboarding tricks.\n",
"Getting embedding for Firemen are walking.\n",
"Getting embedding for The man is outside sledding.\n",
"Getting embedding for A person is indoors.\n",
"Getting embedding for A woman is wearing an apron.\n",
"Getting embedding for There are some people outside.\n",
"Getting embedding for Workers are resting during a meal break.\n",
"Getting embedding for The dog is in the snow.\n",
"Getting embedding for Girls and boys are having fun outdoors\n",
"Getting embedding for People are playing a sport in honor of crippled people.\n",
"Getting embedding for The diners are at a restaurant.\n",
"Getting embedding for A lady with a serious face is standing with two guys in front of steps outside.\n",
"Getting embedding for The family is admiring the water\n",
"Getting embedding for A dog is catching a stick.\n",
"Getting embedding for A girl is picking an item up.\n",
"Getting embedding for The man with the gray hat and pitchfork is directing the cart.\n",
"Getting embedding for More than one person on a bicycle is obeying traffic laws.\n",
"Getting embedding for Community members are spending time in the park near a foundtain.\n",
"Getting embedding for A male sitting indoors.\n",
"Getting embedding for the child is working with wood.\n",
"Getting embedding for A person jumps in the air.\n",
"Getting embedding for Two kids wearing costumes are outside.\n",
"Getting embedding for A person on a bike is near a street.\n",
"Getting embedding for A man is playing a game\n",
"Getting embedding for People cheering.\n",
"Getting embedding for There is a windsurfer balancing on choppy water.\n",
"Getting embedding for The goalie wants to prevent a goal.\n",
"Getting embedding for The child is painting.\n",
"Getting embedding for A woman is looking at a man's possessions\n",
"Getting embedding for man playing soccer\n",
"Getting embedding for The man is able to grow a beard.\n",
"Getting embedding for someone is playing an instrument\n",
"Getting embedding for Human rides two wheeled vehicle.\n",
"Getting embedding for The people are outside.\n",
"Getting embedding for A boy in multi-colored shirt hold his arms out from his sides\n",
"Getting embedding for A man is sitting on a motorcycle.\n",
"Getting embedding for A female adult is near some kids.\n",
"Getting embedding for A guy is driving a dirt bike.\n",
"Getting embedding for There are people watching another person hang up pictures.\n",
"Getting embedding for A kite surfer is falling\n",
"Getting embedding for The woman is on a trolley.\n",
"Getting embedding for The bikers are in the town.\n",
"Getting embedding for A man cooks.\n",
"Getting embedding for A man is wearing a cap\n",
"Getting embedding for A woman walked pasted the front of a building.\n",
"Getting embedding for the runners waited to start the race\n",
"Getting embedding for The people are moving.\n",
"Getting embedding for There are some people in a street\n",
"Getting embedding for There are women showing affection.\n",
"Getting embedding for A view of a crowed place in an asian country.\n",
"Getting embedding for The people are outdoors.\n",
"Getting embedding for An old man is enjoying a beverage at a cafe.\n",
"Getting embedding for A woman is interacting with a dog.\n",
"Getting embedding for Man in blue glasses walking pass a building\n",
"Getting embedding for Two women hug each other.\n",
"Getting embedding for A woman with a yellow to sits.\n",
"Getting embedding for A man is on a dirt bike.\n",
"Getting embedding for a woman eating a banana crosses a street\n",
"Getting embedding for A man pulls on a rope.\n",
"Getting embedding for A soccer game.\n",
"Getting embedding for Cheerleaders cheer on a field for an activity.\n",
"Getting embedding for Two kids are playing with a big rock in the field\n",
"Getting embedding for Bikers stop in towns\n",
"Getting embedding for the woman is wearing a red shirt.\n",
"Getting embedding for The buildings are tall.\n",
"Getting embedding for a child was there\n",
"Getting embedding for Some women are talking.\n",
"Getting embedding for A middle aged oriental woman in a green headscarf and blue shirt is flashing a giant smile\n",
"Getting embedding for A couple is in a hot tub.\n",
"Getting embedding for A small girl with a necklace is in the water\n",
"Getting embedding for A boy is holding his arms out.\n",
"Getting embedding for A blonde woman looks for things in a suitcase.\n",
"Getting embedding for a man is photographing a man skateboarding.\n",
"Getting embedding for The crowd looked on while the players prepared themselves.\n",
"Getting embedding for A man is taking the picture of a skateboarder who is performing a trick.\n",
"Getting embedding for A man sits in front of a set up chess game.\n",
"Getting embedding for People take photos outdoors while a man performs exciting skateboarding tricks.\n",
"Getting embedding for An old man is standing by a building in downtown.\n",
"Getting embedding for lots of people are in the street\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for The young man has glasses on his face.\n",
"Getting embedding for The couple is outdoors.\n",
"Getting embedding for The dog is running.\n",
"Getting embedding for The child is painting the wood.\n",
"Getting embedding for The man is outside.\n",
"Getting embedding for The man plays guitar\n",
"Getting embedding for The guys are playing a game.\n",
"Getting embedding for A man is walking with his horse up a country road.\n",
"Getting embedding for Someone is filming.\n",
"Getting embedding for A man is making a loud noise.\n",
"Getting embedding for A man is bowling.\n",
"Getting embedding for A person is dipping her foot into water.\n",
"Getting embedding for A man is wakeboarding.\n",
"Getting embedding for The man is putting up a poster.\n",
"Getting embedding for Firefighters are checking a car.\n",
"Getting embedding for dogs attacking another dog\n",
"Getting embedding for The girl is under the age of 88 years old.\n",
"Getting embedding for people are together\n",
"Getting embedding for A man is standing in front of a shop.\n",
"Getting embedding for There are people at work.\n",
"Getting embedding for A picture of a city is on a street\n",
"Getting embedding for A man is holding a girls hand and walking through a creek.\n",
"Getting embedding for A man is wearing blue.\n",
"Getting embedding for The boy mugs for the camera.\n",
"Getting embedding for There are children present\n",
"Getting embedding for The girl is sitting.\n",
"Getting embedding for A person and their pet are outdoors\n",
"Getting embedding for The person is interested in a water jet.\n",
"Getting embedding for Toddler wearing mik\n",
"Getting embedding for A man waiting with his computer.\n",
"Getting embedding for The baby is playing.\n",
"Getting embedding for Two kids are with a wagon.\n",
"Getting embedding for People with bikes.\n",
"Getting embedding for A child plays at a park.\n",
"Getting embedding for A happy woman smiling\n",
"Getting embedding for A dog is fetching a stick out of very clear water.\n",
"Getting embedding for A man has facial hair.\n",
"Getting embedding for The couple is dancing together.\n",
"Getting embedding for A couple taking a picture\n",
"Getting embedding for A family of three is at the beach.\n",
"Getting embedding for The man is standing.\n",
"Getting embedding for A man is bowling.\n",
"Getting embedding for a bearded man pulls a rope\n",
"Getting embedding for A man is working on a bike.\n",
"Getting embedding for Several children are jumping rope in the middle of a road while other kids watch\n",
"Getting embedding for Several people in an alleyway.\n",
"Getting embedding for A man resting on a street.\n",
"Getting embedding for The biker is jumping into a hole.\n",
"Getting embedding for Grafffiti on a brick wall.\n",
"Getting embedding for The brightly dressed skier slid down the race course.\n",
"Getting embedding for Somebody is engaging in winter sports.\n",
"Getting embedding for A man shows a woman something.\n",
"Getting embedding for The man is outside.\n",
"Getting embedding for a group of men and women converse\n",
"Getting embedding for Men and women outside on a street corner.\n",
"Getting embedding for The man is smoking something while sitting on the scooter.\n",
"Getting embedding for Some women are reading.\n",
"Getting embedding for They are avoiding trees.\n",
"Getting embedding for The people are moving.\n",
"Getting embedding for The buildings are tall.\n",
"Getting embedding for The Seiko building is large.\n",
"Getting embedding for a group of men and women converse\n",
"Getting embedding for two people by a fountain\n",
"Getting embedding for Gray dog running down pavement toward laundry line in courtyard.\n",
"Getting embedding for Three men, two wearing yellow suits, are looking in the back of a car.\n",
"Getting embedding for A woman wearing all white and eating, walks next to a man holding a briefcase.\n",
"Getting embedding for A man makes a ruckus.\n",
"Getting embedding for The cowboy waved to the crowd.\n",
"Getting embedding for There is a windsurfer balancing on choppy water.\n",
"Getting embedding for A person is a red hat and winter jacket is looking into the distance.\n",
"Getting embedding for The man rides an animal.\n",
"Getting embedding for man playing soccer\n",
"Getting embedding for Man in blue glasses walking pass a building\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for A young man in blue sunglasses walking in front of a red brick building.\n",
"Getting embedding for The girl is under the age of 88 years old.\n",
"Getting embedding for There is a windsurfer balancing on choppy water.\n",
"Getting embedding for The shop sign says \"Welcome to Golden\"\n",
"Getting embedding for People are stretching on yoga mats.\n",
"Getting embedding for A white and brown dog is leaping through the air.\n",
"Getting embedding for There are bicyclists stopped at a road.\n",
"Getting embedding for Children enjoy playing together.\n",
"Getting embedding for The child is happy.\n",
"Getting embedding for Workers are resting during a meal break.\n",
"Getting embedding for There are two woman in this picture.\n",
"Getting embedding for Grafffiti on a brick wall.\n",
"Getting embedding for Human rides two wheeled vehicle.\n",
"Getting embedding for There are people outdoors.\n",
"Getting embedding for The people stretched on yoga mats.\n",
"Getting embedding for Three men, two wearing yellow suits, are looking in the back of a car.\n",
"Getting embedding for There are people watching another person hang up pictures.\n",
"Getting embedding for Three young children consisting of two girls and a boy who is holding an apple with a bite out of it, are posing on a scenic mountain view background.\n",
"Getting embedding for A girl wearing a blue shirt, shorts, and sneakers is seated on a stool at a round table, looking at her phone.\n",
"Getting embedding for There is a soccer game with a team in yellow.\n",
"Getting embedding for The man is outside sledding.\n",
"Getting embedding for A group of adults is having a discussion at a table under a tent.\n",
"Getting embedding for Lady sits gazing at a camera.\n",
"Getting embedding for There are bicyclists stopped at a road.\n",
"Getting embedding for The couple is outdoors.\n",
"Getting embedding for A man and a woman are standing next to sculptures, talking while another man looks at other sculptures.\n",
"Getting embedding for a child was there\n",
"Getting embedding for Workers are eating a meal while one man sits on a pile of plywood.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for Man smokes while sitting on a parked scooter.\n",
"Getting embedding for a woman eating a banana crosses a street\n",
"Getting embedding for They are avoiding trees.\n",
"Getting embedding for The people are by the wall.\n",
"Getting embedding for The old man is standing outside of a building.\n",
"Getting embedding for Someone is filming.\n",
"Getting embedding for Two animals getting to know each other.\n",
"Getting embedding for the dogs see each other\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for Two men trying to build something together, while having fun.\n",
"Getting embedding for a woman and a german shepherd are pictured\n",
"Getting embedding for Somebody is engaging in winter sports.\n",
"Getting embedding for The woman throws a Frisbee to the dog.\n",
"Getting embedding for The bikers are in the town.\n",
"Getting embedding for White small child wearing a brown and gray striped hoodie plays at park.\n",
"Getting embedding for A man dressed in snow-gear takes a leap into a snow-covered ravine.\n",
"Getting embedding for People are fishing and walking next to the water.\n",
"Getting embedding for The workers are standing still.\n",
"Getting embedding for People pose for a picture.\n",
"Getting embedding for Three firefighters, the nearest firefighter is holding a helmet in his left hand.\n",
"Getting embedding for People are having a discussion.\n",
"Getting embedding for Children's soccer game being played while the sun sets in the background.\n",
"Getting embedding for three bikers stop in town.\n",
"Getting embedding for The people are by the wall.\n",
"Getting embedding for a woman on a yellow shirt is on the floor.\n",
"Getting embedding for Two people enjoying a water fountain display.\n",
"Getting embedding for Firemen are walking.\n",
"Getting embedding for A person and their pet are outdoors\n",
"Getting embedding for People are in the street.\n",
"Getting embedding for People are about to eat.\n",
"Getting embedding for A street performer is trying to earn extra money.\n",
"Getting embedding for A person is hanging up pictures of women with a few onlookers watching surrounded by bikes.\n",
"Getting embedding for A group of people are doing yoga.\n",
"Getting embedding for A smiling lady in a green jacket at a public gathering.\n",
"Getting embedding for Two animals getting to know each other.\n",
"Getting embedding for Men and women outside on a street corner.\n",
"Getting embedding for There is a person processing vegetables.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for A man is windsurfing.\n",
"Getting embedding for THe woman is sitting down\n",
"Getting embedding for A man, woman, and child enjoying themselves on a beach.\n",
"Getting embedding for People are having a discussion.\n",
"Getting embedding for A woman in a blue shirt is sitting at a table and looking at her cellphone.\n",
"Getting embedding for There is a soccer game.\n",
"Getting embedding for Two large dogs greet other while their owners watch.\n",
"Getting embedding for A small girl dressed in a yellow dress with flowers on it bends over near a large pile of watermelons.\n",
"Getting embedding for People are near water.\n",
"Getting embedding for two people by a fountain\n",
"Getting embedding for Child in red and blue shirt painting a log.\n",
"Getting embedding for A street performer is trying to earn extra money.\n",
"Getting embedding for The man is outside sledding.\n",
"Getting embedding for People walking around in a big city.\n",
"Getting embedding for a child is pushing another kid in a wheeler dressed in a red top and wearing a cap.\n",
"Getting embedding for A man holds a clipboard and a pen as a woman looks at them.\n",
"Getting embedding for The man is fertilizering his garden.\n",
"Getting embedding for The white and brown dog is in the air.\n",
"Getting embedding for People pose for a picture.\n",
"Getting embedding for Lady sits gazing at a camera.\n",
"Getting embedding for A woman talks to two other women and a man with notepads in an office building with large windows.\n",
"Getting embedding for A woman talking to four little children outside.\n",
"Getting embedding for There are some guys in this picture\n",
"Getting embedding for A middle aged oriental woman in a green headscarf and blue shirt is flashing a giant smile\n",
"Getting embedding for The young man has glasses on his face.\n",
"Getting embedding for Toddler wearing mik\n",
"Getting embedding for A person is a red hat and winter jacket is looking into the distance.\n",
"Getting embedding for Workers are eating a meal while one man sits on a pile of plywood.\n",
"Getting embedding for a woman with a straw hat working on a strange machine with coconuts at her side.\n",
"Getting embedding for The person stares off into the distance.\n",
"Getting embedding for The dog is in the snow.\n",
"Getting embedding for dogs attacking another dog\n",
"Getting embedding for Two little kids showing their American pride in their star spangled wagon.\n",
"Getting embedding for People are walking outdoors.\n",
"Getting embedding for A small girl stands among many large watermelons.\n",
"Getting embedding for A man wearing a gray cap is looking down.\n",
"Getting embedding for The Seiko building is large.\n",
"Getting embedding for People are standing on a grassy field\n",
"Getting embedding for Young blond woman putting her foot into a water fountain\n",
"Getting embedding for Young people playing with a long jump rope in the street.\n",
"Getting embedding for Several people in an alleyway.\n",
"Getting embedding for Two kids are playing with a big rock in the field\n",
"Getting embedding for Two adults walking across a road\n",
"Getting embedding for The bikers are in the town.\n",
"Getting embedding for The red team knocked the ball toward the goal and the black team tried to block it.\n",
"Getting embedding for Two people are looking at a clipboard.\n",
"Getting embedding for A man is wakeboarding.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for In a bowling alley, a man holding a green bowling ball looks ahead at the pins that he must knock down.\n",
"Getting embedding for Man smokes while sitting on a parked scooter.\n",
"Getting embedding for Two women are talking while children are sitting on their laps.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for A man with wild hair rocks a show playing a guitar center stage.\n",
"Getting embedding for two men serving preparing food.\n",
"Getting embedding for Two people with bicycles, one in front running with a bike and one in back riding.\n",
"Getting embedding for Workers are resting during a meal break.\n",
"Getting embedding for A woman wearing an apron inspects a large pot on a table filled with cups, bowls, pots and baskets of assorted size.\n",
"Getting embedding for A man is putting up a poster in front of a shop.\n",
"Getting embedding for a lone person jumping through the air from one snowy mountain to another.\n",
"Getting embedding for The toddler has milk around the corners of his mouth.\n",
"Getting embedding for Some men are sitting outdoors.\n",
"Getting embedding for An older man dressed in blue historical clothing is ringing a bell in his right hand.\n",
"Getting embedding for Two people pose for the camera.\n",
"Getting embedding for A classroom is discussing the topics of the day.\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for A woman is wearing a green sweatshirt.\n",
"Getting embedding for The people are outside.\n",
"Getting embedding for Two men are barefoot on the lawn.\n",
"Getting embedding for Somebody is engaging in winter sports.\n",
"Getting embedding for Children playing a game.\n",
"Getting embedding for A young toddler wearing pink sandals is walking on hopscotch numbers.\n",
"Getting embedding for Two kids are playing with a big rock in the field\n",
"Getting embedding for A man with a beard, wearing a red shirt with gray sleeves and work gloves, pulling on a rope.\n",
"Getting embedding for A soccer game.\n",
"Getting embedding for Four people near a body of water, one sitting and three standing, while two people walk on a nearby sidewalk.\n",
"Getting embedding for The white and brown dog is in the air.\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for An old man wearing khaki pants and a brown shirt standing on the sidewalk in front of a building.\n",
"Getting embedding for a motorcyclist does a nose wheelie.\n",
"Getting embedding for A woman in a blue shirt is sitting at a table and looking at her cellphone.\n",
"Getting embedding for Somebody is engaging in winter sports.\n",
"Getting embedding for Two women who just had lunch hugging and saying goodbye.\n",
"Getting embedding for a guy is performing a bike trick\n",
"Getting embedding for The young man has glasses on his face.\n",
"Getting embedding for People waiting to get on a train or just getting off.\n",
"Getting embedding for a group of men and women converse\n",
"Getting embedding for More than one person on a bicycle is obeying traffic laws.\n",
"Getting embedding for The family is outside.\n",
"Getting embedding for The family is admiring the water\n",
"Getting embedding for The workers are standing still.\n",
"Getting embedding for A woman stand on a fountain and dips her toes in.\n",
"Getting embedding for People are standing on a grassy field\n",
"Getting embedding for Two adults walk across a street.\n",
"Getting embedding for a person in orange\n",
"Getting embedding for Lady wearing a yellow top is sitting on a chair\n",
"Getting embedding for people sit around table\n",
"Getting embedding for Men and women outside on a street corner.\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for A child using a woodworking tool\n",
"Getting embedding for An older couple posing in front of a fountain for a picture\n",
"Getting embedding for A log is being painted by a child.\n",
"Getting embedding for In a bowling alley, a man holding a green bowling ball looks ahead at the pins that he must knock down.\n",
"Getting embedding for The cheerleaders are wearing clothes.\n",
"Getting embedding for Workers are on break.\n",
"Getting embedding for two men serving preparing food.\n",
"Getting embedding for Two people dancing, wearing dance costumes.\n",
"Getting embedding for Bikers stop in towns\n",
"Getting embedding for A man wearing a blue shirt screaming or yelling with his arms raised up in the air.\n",
"Getting embedding for Woman in white in foreground and a man slightly behind walking with a sign for John's Pizza and Gyro in the background.\n",
"Getting embedding for A saddle bronc rider gets lifted out of the saddle, but keeps his grip during his ride.\n",
"Getting embedding for the man is working on the computer\n",
"Getting embedding for Bicyclists waiting at an intersection.\n",
"Getting embedding for man ringing a bell\n",
"Getting embedding for A group of people are sitting around a table under a blue sunshade.\n",
"Getting embedding for Lady sits gazing at a camera.\n",
"Getting embedding for Two people are next to a fountain together.\n",
"Getting embedding for The girl is under the age of 88 years old.\n",
"Getting embedding for someone is playing an instrument\n",
"Getting embedding for Some women are reading.\n",
"Getting embedding for A woman wearing a green and pink dress is dancing with someone wearing a blue top with white pants.\n",
"Getting embedding for People are playing a sport.\n",
"Getting embedding for a man wearing a multicolored striped shirt playing the guitar on the street\n",
"Getting embedding for the dogs see each other\n",
"Getting embedding for a woman looking at her cellphone\n",
"Getting embedding for A woman in a black and orange jacket throws a stick for a brown and black dog to fetch.\n",
"Getting embedding for Two men are laughing and enjoying themselves.\n",
"Getting embedding for Several children are jumping rope in the middle of a road while other kids watch\n",
"Getting embedding for A skier in electric green on the edge of a ramp made of metal bars.\n",
"Getting embedding for A smiling lady in a green jacket at a public gathering.\n",
"Getting embedding for Two adults walking across a road\n",
"Getting embedding for The buildings are tall.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for A man is outside.\n",
"Getting embedding for Young lady dressed in black shorts and light blue shirt sitting outside at a public table looking at a picture on her camera with her left hand on her face.\n",
"Getting embedding for Kids are playing outdoors.\n",
"Getting embedding for A man is wearing blue.\n",
"Getting embedding for People wait on traffic.\n",
"Getting embedding for Bikers stop in towns\n",
"Getting embedding for The young man has glasses on his face.\n",
"Getting embedding for People are outside.\n",
"Getting embedding for There are some guys in this picture\n",
"Getting embedding for Three men looking at a car.\n",
"Getting embedding for The people are outdoors.\n",
"Getting embedding for A small girl dressed in a yellow dress with flowers on it bends over near a large pile of watermelons.\n",
"Getting embedding for A man is holding a girls hand and walking through a creek.\n",
"Getting embedding for Five people on stage performing and acting while girl lay's on belly.\n",
"Getting embedding for People in orange vests and blue pants with a yellow stripe at the bottom await the beginning of a race.\n",
"Getting embedding for a man wearing blue plays soccer.\n",
"Getting embedding for The people are doing yoga\n",
"Getting embedding for The boy is young.\n",
"Getting embedding for Two kids are playing with a big rock in the field\n",
"Getting embedding for Some children are playing jump rope.\n",
"Getting embedding for People are playing soccer.\n",
"Getting embedding for People walking around in a big city.\n",
"Getting embedding for The old man is painting a portrait.\n",
"Getting embedding for The boy does a skateboarding trick.\n",
"Getting embedding for The man is fertilizering his garden.\n",
"Getting embedding for The red team knocked the ball toward the goal and the black team tried to block it.\n",
"Getting embedding for A dog with an object in it's mouth is in the water.\n",
"Getting embedding for A soccer game where the team in yellow is attempting to advance past the team in white towards the goalie wearing a black top and blue shorts.\n",
"Getting embedding for People on bicycles waiting at an intersection.\n",
"Getting embedding for There are two woman in this picture.\n",
"Getting embedding for A woman in a red shirt is speaking at a table in a room where three other people are listening to her.\n",
"Getting embedding for The man has something to tell the woman.\n",
"Getting embedding for a man wearing blue plays soccer.\n",
"Getting embedding for A photographer takes a picture of the boy's parents by the fountain.\n",
"Getting embedding for A small girl dressed in a yellow dress with flowers on it bends over near a large pile of watermelons.\n",
"Getting embedding for The children are playing in a rocky field.\n",
"Getting embedding for One man writes on papers, while another man stands by.\n",
"Getting embedding for A man makes a ruckus.\n",
"Getting embedding for The people are moving.\n",
"Getting embedding for A man is sitting on a motorcycle.\n",
"Getting embedding for A man in a bright green shirt shows a woman in a bright pink shirt something on a clipboard.\n",
"Getting embedding for White small child wearing a brown and gray striped hoodie plays at park.\n",
"Getting embedding for Kids pose in front of a mountain background.\n",
"Getting embedding for The man is fertilizering his garden.\n",
"Getting embedding for There are some people in a street\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for A woman in a blue shirt is sitting at a table and looking at her cellphone.\n",
"Getting embedding for The boy does a skateboarding trick.\n",
"Getting embedding for Two men are barefoot on the lawn.\n",
"Getting embedding for Young people playing with a long jump rope in the street.\n",
"Getting embedding for There are some guys in this picture\n",
"Getting embedding for The child is painting.\n",
"Getting embedding for Two guys playing football on a campus green.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for Girl is blowing to a butterfly.\n",
"Getting embedding for There is a soccer game.\n",
"Getting embedding for Workers are eating a meal while one man sits on a pile of plywood.\n",
"Getting embedding for A young boy with a blue coat makes a funny face as he walks towards the grass.\n",
"Getting embedding for A man wearing a gray cap is looking down.\n",
"Getting embedding for a woman and a german shepherd are pictured\n",
"Getting embedding for Brown dog treads through water.\n",
"Getting embedding for The cheerleaders are wearing clothes.\n",
"Getting embedding for Girls and boys are having fun outdoors\n",
"Getting embedding for Asian school children sitting on each others shoulders.\n",
"Getting embedding for Girls and boys are having fun outdoors\n",
"Getting embedding for A man is sitting on a motorcycle on the sidewalk.\n",
"Getting embedding for Two guys playing football on a campus green.\n",
"Getting embedding for The people stretched on yoga mats.\n",
"Getting embedding for People are fishing and walking next to the water.\n",
"Getting embedding for Two children are running down a sidewalk dressed in costumes.\n",
"Getting embedding for two men serving preparing food.\n",
"Getting embedding for A man dressed in blue shirt and shorts sits at a table while playing black in chess.\n",
"Getting embedding for pwople are on stage\n",
"Getting embedding for A person throwing something for her dog.\n",
"Getting embedding for Two people dancing, wearing dance costumes.\n",
"Getting embedding for Children playing a game.\n",
"Getting embedding for An older man stands on the sidewalk painting the view.\n",
"Getting embedding for A yellow uniformed skier is performing a trick across a railed object.\n",
"Getting embedding for The cheerleaders are wearing clothes.\n",
"Getting embedding for The toddler has milk around the corners of his mouth.\n",
"Getting embedding for Two women hug each other.\n",
"Getting embedding for The girl blows a butterfly.\n",
"Getting embedding for The woman in green and pink is dancing.\n",
"Getting embedding for a woman is talking\n",
"Getting embedding for Some firefighters check a vehicle.\n",
"Getting embedding for The shop sign says \"Welcome to Golden\"\n",
"Getting embedding for The man is outside sledding.\n",
"Getting embedding for Peole stand by a building supply truck.\n",
"Getting embedding for The cheerleaders are wearing clothes.\n",
"Getting embedding for The red and black team are playing a game.\n",
"Getting embedding for The silhouette of three people in front of a wall.\n",
"Getting embedding for A small boy has gotten into the cabinet and gotten flour and crisco all over himself.\n",
"Getting embedding for The man has something to tell the woman.\n",
"Getting embedding for A male is getting a drink of water.\n",
"Getting embedding for The red team knocked the ball toward the goal and the black team tried to block it.\n",
"Getting embedding for A farmer fertilizing his garden with manure with a horse and wagon.\n",
"Getting embedding for The old man is painting a portrait.\n",
"Getting embedding for A man with wild hair rocks a show playing a guitar center stage.\n",
"Getting embedding for People with bikes.\n",
"Getting embedding for Two women are talking while children are sitting on their laps.\n",
"Getting embedding for The toddler has milk around the corners of his mouth.\n",
"Getting embedding for People are walking outdoors.\n",
"Getting embedding for A lady with sunglasses on her head and a green sweatshirt is looking off-camera.\n",
"Getting embedding for Toddler with milk around his mouth.\n",
"Getting embedding for One soccer team is playing against another.\n",
"Getting embedding for Two women hug each other.\n",
"Getting embedding for There is a girl standing\n",
"Getting embedding for A man in a colorful shirt is playing an instrument.\n",
"Getting embedding for Two guys playing football on a campus green.\n",
"Getting embedding for A child was making a mess with milk.\n",
"Getting embedding for The man has something to tell the woman.\n",
"Getting embedding for People are playing a sport.\n",
"Getting embedding for A man shows a woman something.\n",
"Getting embedding for The woman is on a trolley.\n",
"Getting embedding for a woman on a yellow shirt is on the floor.\n",
"Getting embedding for the dogs see each other\n",
"Getting embedding for The brightly dressed skier slid down the race course.\n",
"Getting embedding for There are bicyclists stopped at a road.\n",
"Getting embedding for Two dogs biting another dog in a field.\n"
]
}
],
"source": [
"# establish a cache of embeddings to avoid recomputing\n",
"# cache is a dict of tuples (text, engine) -> embedding\n",
"try:\n",
" with open(embedding_cache_path, \"rb\") as f:\n",
" embedding_cache = pickle.load(f)\n",
"except FileNotFoundError:\n",
" precomputed_embedding_cache_path = \"https://cdn.openai.com/API/examples/data/snli_embedding_cache.pkl\"\n",
" embedding_cache = pd.read_pickle(precomputed_embedding_cache_path)\n",
"\n",
"\n",
"# this function will get embeddings from the cache and save them there afterward\n",
"def get_embedding_with_cache(\n",
" text: str,\n",
" engine: str = default_embedding_engine,\n",
" embedding_cache: dict = embedding_cache,\n",
" embedding_cache_path: str = embedding_cache_path,\n",
") -> list:\n",
" print(f\"Getting embedding for {text}\")\n",
" if (text, engine) not in embedding_cache.keys():\n",
" # if not in cache, call API to get embedding\n",
" embedding_cache[(text, engine)] = get_embedding(text, engine)\n",
" # save embeddings cache to disk after each update\n",
" with open(embedding_cache_path, \"wb\") as embedding_cache_file:\n",
" pickle.dump(embedding_cache, embedding_cache_file)\n",
" return embedding_cache[(text, engine)]\n",
"\n",
"\n",
"# create column of embeddings\n",
"for column in [\"text_1\", \"text_2\"]:\n",
" df[f\"{column}_embedding\"] = df[column].apply(get_embedding_with_cache)\n",
"\n",
"# create column of cosine similarity between embeddings\n",
"df[\"cosine_similarity\"] = df.apply(\n",
" lambda row: cosine_similarity(row[\"text_1_embedding\"], row[\"text_2_embedding\"]),\n",
" axis=1,\n",
")\n"
]
},
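  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check, we can re-request the embedding for a sentence that was just processed: because that (text, engine) pair is already in the cache, the helper prints its log line but returns the stored vector without making a new API call. This is a minimal sketch that assumes the df and get_embedding_with_cache defined above."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# sanity check: this text was embedded above, so the lookup below is served from the local cache\n",
    "example_text = df[\"text_1\"].values[0]\n",
    "example_embedding = get_embedding_with_cache(example_text)\n",
    "print(f\"embedding length: {len(example_embedding)}\")\n",
    "print(f\"first 5 dimensions: {example_embedding[:5]}\")\n"
   ]
  },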
{
"cell_type": "markdown",
"metadata": {
"id": "4pwn608LpgkQ"
},
"source": [
"## 6. Plot distribution of cosine similarity\n",
"\n",
"Here we measure similarity of text using cosine similarity. In our experience, most distance functions (L1, L2, cosine similarity) all work about the same. Note that our embeddings are already normalized to length 1, so cosine similarity is equivalent to dot product.\n",
"\n",
"The graphs show how much the overlap there is between the distribution of cosine similarities for similar and dissimilar pairs. If there is a high amount of overlap, that means there are some dissimilar pairs with greater cosine similarity than some similar pairs.\n",
"\n",
"The accuracy I compute is the accuracy of a simple rule that predicts 'similar (1)' if the cosine similarity is above some threshold X and otherwise predicts 'dissimilar (0)'."
]
},
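  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Below is a minimal sketch of that threshold rule, assuming the df built above (with its label, dataset, and cosine_similarity columns): predict 'similar (1)' when the cosine similarity exceeds a threshold and 'dissimilar (-1)' otherwise, then sweep candidate thresholds on the training split to find the best accuracy. The plotting cell that follows visualizes the underlying distributions."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# minimal sketch of the threshold rule described above\n",
    "# assumes df has the label, dataset, and cosine_similarity columns built earlier\n",
    "import numpy as np  # already imported at the top of the notebook\n",
    "\n",
    "\n",
    "def threshold_accuracy(similarities: np.ndarray, labels: np.ndarray, threshold: float) -> float:\n",
    "    # predict similar (1) above the threshold, dissimilar (-1) otherwise, and score against the labels\n",
    "    predictions = np.where(similarities > threshold, 1, -1)\n",
    "    return float((predictions == labels).mean())\n",
    "\n",
    "\n",
    "train = df[df[\"dataset\"] == \"train\"]\n",
    "best_accuracy, best_threshold = max(\n",
    "    (threshold_accuracy(train[\"cosine_similarity\"].values, train[\"label\"].values, t), t)\n",
    "    for t in np.linspace(0.5, 1.0, 51)\n",
    ")\n",
    "print(f\"best simple-rule accuracy: {best_accuracy:.3f} at threshold {best_threshold:.2f}\")\n"
   ]
  },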
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"id": "SoeDF8vqpgkQ",
"outputId": "17db817e-1702-4089-c4e8-8ca32d294930"
},
"outputs": [
{
"data": {
"application/vnd.plotly.v1+json": {
"config": {
"plotlyServerURL": "https://plot.ly"
},
"data": [
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=train<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
0.9267355922090726,
0.8959824210865295,
0.911972591898887,
0.854066984904447,
0.892887342538514,
0.9197115504102283,
0.86454296137645,
0.8314164148734599,
0.724331390174805,
0.8819971496348794,
0.7956215054013406,
0.7959481828066851,
0.8682525486487739,
0.8973704559214578,
0.8648042598103035,
0.9236698983952911,
0.9834804743408886,
0.8152447417624246,
0.82517200338841,
0.8138195591908199,
0.804188586062905,
0.9329690881323882,
0.9560346902836805,
0.9727875564710335,
0.8739787475357144,
0.8208200931608043,
0.7246913155327134,
0.9324916311845146,
0.8285737168086551,
0.8797008553699697,
0.8203332150276859,
0.9370111006544561,
0.8983827482700403,
0.8312111261703522,
0.8164052526562986,
0.89081486465724,
0.7466264016350165,
0.7496519642865328,
0.8737558267185661,
0.7849398152833806,
0.8309506411877995,
0.930721217634791,
0.8281747402318884,
0.9529528463964135,
0.78286620810114,
0.8871009561039284,
0.9000278355775503,
0.8805448754876422,
0.9303377269239715,
0.880195490304124,
0.8529206894100387,
0.9467797365089127,
0.9503676908767298,
0.7030531845036039,
0.8643992383828719,
0.8536886653620115,
0.9619331110018076,
0.9798279216368141,
0.8545739734233097,
0.8957115038209394,
0.8241137164789778,
0.8234984829866299,
0.8936706242503488,
0.8987178415114151,
0.9081806523258728,
0.9208852069309506,
0.8961858080568302,
0.8831329492644463,
0.9282623086728464,
0.8990849222879878,
0.8284548404976377,
0.8202091320216596,
0.8647762708043815,
0.8401369579324562,
0.9887387560741359,
0.8333426560846096,
0.8285331108196707,
0.9118814662694842,
0.8706628935716073,
0.9279786047447278,
0.7389559884393851,
0.8433932042319168,
0.9240307531537069,
0.9507699373739879,
0.8586024439929903,
0.8685107123188051,
0.8755350362634888,
0.9894909158977805,
0.8279650076349706,
0.9108703736855251,
0.9090161902538356,
0.8603952587890591,
0.7791958177087142,
0.8800175081701747,
0.8442387838852023,
0.7672266109003523,
0.9379753909220483,
0.8637536217766965,
0.9190295692184896,
0.8137487445889441,
0.913488637289173,
0.8043760077432088,
0.879230049130692,
0.8716796299186113,
0.8669146720822511,
0.7736224662910375,
0.9439048564746342,
0.905686329549249,
0.9534823417127044,
0.9150626364280348,
0.9409873575925382,
0.8111212514948384,
0.9171209894364517,
0.9126582215678652,
0.8337042978731589,
0.7317859049265332,
0.8444929456896246,
0.8561920423978694,
0.7765276737312753,
0.8526116780064548,
0.9178549175037264,
0.9238337663325366,
0.7218806787029511,
0.8180162425905607,
0.9687438996846139,
0.8354559170776137,
0.9146160669265362,
0.808210346566899,
0.9563444959106976,
0.9066029888020705,
0.8485102489128452,
0.8154210964137394,
0.8862154929899421,
0.9280705424664027,
0.9835182438283631,
0.9653797794869796,
0.7815047664005954,
0.7156150759652161,
0.9256075945052357,
0.8135073899611842,
0.9655015183317774,
0.8222681606077051,
0.9072121875352692,
0.8611990749314697,
0.9075083706276883,
0.9452697088507865,
0.8792221642490844,
0.9261547231888615,
0.8628843091591882,
0.7825774678762871,
0.8265878682590281,
0.739969794812712,
0.7855475190052562,
0.9111614048749492,
0.8871057099019782,
0.8824403180046967,
0.8618250241358769,
0.9787037899591454,
0.8066230593744526,
0.8276910929485197,
0.9246432065475546,
0.8840853147036714,
0.7864843506607968,
0.9106863309457174,
0.9342800777751461,
0.8573335076933394,
0.7780871068368611,
0.7913314671023687,
0.8574377397654754,
0.9078366094992862,
0.752973927739965,
0.8630106574340903,
0.9051526765387132,
0.7715467460924421,
0.8941465881092564,
0.8095881925341774,
0.7733578297403775,
0.7600408383615723,
0.7819972023010567,
0.9003461723046663,
0.742804802462531,
0.8645936494952892,
0.8158769876746998,
0.8338827591801034,
0.8272653957842918,
0.9017517383025067,
0.8480852031381642,
0.7970818327030217,
0.8483706700151505,
0.9272909957177218,
0.9511439768573109,
0.8796630928594475,
0.8297595345126891,
0.8132311692835352,
0.8460965104145681,
0.8787645382723887,
0.8591367321478075,
0.8452813271088438,
0.7081208529169517,
0.8769677227983257,
0.9576216492651992,
0.7463356296909661,
0.8618039394725079,
0.9560112448844987,
0.8478374741588728,
0.769289016610608,
0.8458585917175788,
0.9014601942019844,
0.8816990618751593,
0.8836365020988086,
0.8078009752591794,
0.8984716696273352,
0.9064470720437559,
0.8762712604989469,
0.9178852324400089,
0.7896235961898858,
0.8939345730555539,
0.9534018416101309,
0.8358942065066962,
0.948865711109057,
0.9046799884368947,
0.7583576539746958,
0.9080459944470666,
0.7709722699637687,
0.963551247793185,
0.9792712669973792,
0.8526700752964347,
0.827813310501214,
0.9735858612930184,
0.7212301964264753,
0.8257425306850711,
0.924320548123444,
0.9183796450934556,
0.9029146930594939,
0.9410246041287362,
0.9609604037240548,
0.7467407977088399,
0.8831901227140917,
0.8173287201360423,
0.8067347035873811,
0.7921957440752069,
0.9110994798640996,
0.8678737504816454,
0.91177432256281,
0.7812564975232954,
0.8553931177741548,
0.8798565771781157,
0.8485358177151634,
0.7748765500469469,
0.9432062978626803,
0.8328320715664294,
0.798362976362054,
0.9345589971516312,
0.7800346997026738,
0.9894680324717378,
0.8239308908293631,
0.8236003487600682,
0.8346101071823683,
0.8273793498951607,
0.7872103659197973,
0.9502897886350955,
0.8330663046259037,
0.934656824021464,
0.8082083574312163,
0.8920672691284423,
0.8566523142422968,
0.7636170839305908,
0.8271048233812095,
0.8450776680779332,
0.9045266242453643,
0.8578964048993004,
0.8673866120865574,
0.8804224183254911,
0.8199459541564516,
0.9324100333449752,
0.9096821257786284,
0.8658255623901577,
0.9386720382389069,
0.8517830108211426,
0.8894337360140997,
0.9788475938303791,
0.8369738176471242,
0.8438616298356066,
0.9457050131096572,
0.8699723457920832,
0.7795221422725261,
0.9136284838226408,
0.8394610380643428,
0.9453279812604809,
0.7899532079576974,
0.9078373592832483,
0.8434980565725266,
0.8112068695892253,
0.9466506417952321,
0.931413666521914,
0.7932453739077451,
0.8205411410996694,
0.9243834389749737,
0.7196162090749076,
0.7552985097607482,
0.9593440980269001,
0.9175579371411101,
0.8643861904380715,
0.8315201131392358,
0.7608819740667967,
0.9704324556248521,
0.8037085296495649,
0.7785353984256803,
0.8044961880185003,
0.8313307508528462,
0.8064106355318161,
0.9291149178587121,
0.8412940943665776,
0.6917091092254815,
0.8952044326369335,
0.818225072265956,
0.8645847235619342,
0.8532020278604288,
0.8143634599177915,
0.8829012215420231,
0.7764652540281851,
0.8500993692007114,
0.8616919094128496,
0.9257293988684876,
0.935772204981356,
0.774265719975256,
0.789871006952492,
0.8590438495949997,
0.9317809675958327,
0.9087109945316316,
0.9492979985891563,
0.8813316522495983,
0.737208140494784,
0.8838176414418067
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=test<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
0.9424796846788046,
0.9078956616062651,
0.8334324869405139,
0.9352180100721489,
0.9055462990278683,
0.8981939713362292,
0.8310153265298836,
0.8504676065056102,
0.8456281890127811,
0.8845204605513738,
0.9575409744952922,
0.8867362111321382,
0.8268148049027775,
0.9197424492086052,
0.7868932882211557,
0.7584994078201337,
0.9184151112777117,
0.8634069824306613,
0.8347803692078435,
0.8293627321978324,
0.9290633376090963,
0.8385821685601387,
0.9389267225654604,
0.8908184420511278,
0.8663476047908254,
0.8406483287589527,
0.8084243400296846,
0.8909500802168062,
0.9262896014538773,
0.8955541227032415,
0.8055268516127605,
0.7586268193375352,
0.9609058493434491,
0.9149590584369259,
0.8670137150023248,
0.8813831596952219,
0.860225515397,
0.9239960993694921,
0.9173221779567197,
0.8037285375166193,
0.9196033586084531,
0.8179495005725935,
0.9015423000521007,
0.9054394611244669,
0.9309412938014421,
0.9421722896767072,
0.7632823193304991,
0.8622055681944147,
0.9855273112832761,
0.914415556703985,
0.9160573926361296,
0.8027504541651594,
0.7131090046615766,
0.8617419486109846,
0.98287317120162,
0.8100227524488052,
0.892387860418092,
0.809664342563128,
0.8707613725090536,
0.8786740135792194,
0.827463989695164,
0.8927098766437765,
0.9565597072685753,
0.9060728094488207,
0.7383075176406174,
0.9645943656943117,
0.8755564011198428,
0.879644342835206,
0.8679709662655806,
0.9304235140233539,
0.8902804954107686,
0.874836956726809,
1,
0.7979398160311217,
0.8182553476855959,
0.7782108664889419,
0.8427610541278799,
0.8696408841463731,
0.8747903021226509,
0.9149733683476413,
0.9651568967676807,
0.977554798666313,
0.8964005890099545,
0.8689760342800351,
0.8501707280841363,
0.9069421093108844,
0.7682621581806748,
0.9658683145893564,
0.8946443490839046,
0.7855154288057422,
0.8963791538152951,
0.8062904923128396,
0.8165205974456892,
0.8392522239745428,
0.9456080865553905,
0.7904904118155052,
0.8331267917887729,
0.7852156048607353,
0.7859162372602091,
0.90976749903987,
0.8868692158735381,
0.9391826646888828,
0.9428151203411792,
0.7923603881082193,
0.9018727187087263,
0.97231619441654,
0.7820369106687125,
0.9667234198836612,
0.9787696268534193,
0.9155729796430734,
0.8273013981821028,
0.960331962375041,
0.929897501248699,
0.8775117472056205,
0.8613342799390303,
0.9144155658413454,
0.7783710778245275,
0.9701880187837707,
0.7858944695167878,
0.9278353488412265,
0.9472367442821338,
0.7834809783164823,
0.7997358970000906,
0.8459052928211823,
0.8612077001477506,
0.8470901718545574,
0.8240372721865142,
0.8656086505509303,
0.8023193245375629,
0.783678884717712,
0.8804041342871782,
0.8491559248265502,
0.7883452708992278,
0.9461393747874567,
0.8351233852567399,
0.8158174033362672,
0.8604581312681885,
0.9623616564166072,
0.856468839580938,
0.8576867667576002,
0.8973905359734362,
0.8634447086393151,
0.8149528594157183,
0.8731712539786042,
0.8653347693348777,
0.9295255577503568,
0.8358267202312724,
0.9718886825986638,
0.8500189244661982,
0.6201715853032974,
0.8982737441192186,
0.8919523976747616,
0.7327218610615461,
0.8329671226232828,
0.9265589852995393,
0.8976605728389208,
0.8865148834725959,
0.7893917266176482,
0.7303107669745307,
0.8428958494374836,
0.8712646527997077,
0.9726111204993027,
0.9368020235357589,
0.9270010845221283,
0.8900608737222808,
0.79751731467271,
0.940330874442756,
0.8484005154341017,
0.9285585486502653,
0.8461714648336822,
0.9301612560985565,
0.9840391345414705,
0.8305503022437543,
0.8985536904301074,
0.9477072571711664,
0.934289266722412,
0.8849523260221185,
0.773662084263725,
0.8083290895710892,
0.9510007702344464,
0.8677438099387293,
0.8324233959729913,
0.7379868665757632,
0.9049462203262157,
0.9044068971508709,
0.7810399091823383,
0.9041769944901107,
0.7720832575605646,
0.7168259247291856,
0.8657076247663684,
0.9689982289113886,
0.9330371342125484,
0.7014093148352947,
0.9056081834465988,
0.8483474406338491,
0.8729108893579319,
0.8494252832990817,
0.8702668024360607,
0.8703072657352607,
0.9279473627134431,
0.8615930019969985,
0.7590822858582416,
0.8435232133017242,
0.8264379729550373,
0.8793126203874563,
0.8474523011181411,
0.7546334362798065,
0.8870818558635253,
0.8349553719953364,
0.923200758907938,
0.7924421886376952,
0.855610314876051,
0.8397958722387048,
0.9358165871780313,
0.9045773532651927,
0.9022537126477369,
0.7756039171534931,
0.9460916193165211,
0.8264119474819362,
0.8261258110555288,
0.8605336601635148,
0.7518422502719879,
0.8495875568327971,
0.992279957461567,
0.7499254098383082,
0.8845204605513738,
0.8361936554147797,
0.9172228811270781,
0.8068135569680097,
0.7957399297673027,
0.8632611459497657,
0.7612462572836113,
0.958912542207282,
0.9555759038520236,
0.8822980111141415,
0.9663740138580926,
0.9071760951682218,
0.933533889331542,
0.8042262160076494,
0.9399607299036465,
0.8318513717574904,
0.8697471261915183,
0.9103391823944785,
0.8272582058280911,
0.7868989551985196,
0.741616891032038,
0.8828593526738941,
0.9141342991713857,
0.7259887482535182,
0.9478299712074272,
0.8437665184157634,
0.9198304263214642,
0.9069062939546915,
0.9036466179892355,
0.9817542892477462,
0.8833292620163823,
0.8325566159927532,
0.8135910430676571,
0.9628932976448151,
0.9450804651757593,
0.9226384097207587,
0.8401818092769459,
0.7236914068799891,
0.6828741129809796,
0.8344105231696747,
0.9959256404068638,
0.9528703966342777,
0.9695146929637602,
0.9220387803870667,
0.9511950111612875,
0.8744220297098892,
0.8399026052955197,
0.9029483760093544,
0.9097073428234548,
0.8651925589034414,
0.9178332688200683,
0.7556713750040486,
0.8601740878617401,
0.8250804248322693,
0.7994733073162199,
0.8911389632926229,
0.9159137771752827,
0.7867422038306616,
0.8035375125861887,
0.7702882646822419,
0.9060460436592801,
0.7214029227404364,
0.8607904816523709,
0.8228468627026362,
0.8900020170242702,
0.9343567733995704,
0.9305049273825277,
0.9664193138851035,
0.9008537856737299,
0.7625840736444573,
0.8153020546259354,
0.9215061720116507,
0.7192673780176765,
0.8949994062319516,
0.936756654753208,
0.7602684168515255,
0.8184439768344212,
0.8361983865246644,
0.7761725471031594,
0.7724780963721255,
0.9249211342441499,
0.8718843142394451,
0.8522890338443532,
0.9015475856777736,
0.8720699804712655,
0.8937599375974886,
0.8721713576430158,
0.8100783165392635,
1,
0.8213222547688209,
0.8361185411078411,
0.8371907462164929,
0.9065697379059939,
0.7522406715066838,
0.828307889290731,
0.8499886821303806,
0.9097932363997518,
0.9529813102433097,
0.8449289750216329,
1,
0.8302949354181002,
0.7741532048489975,
0.8743828041850981,
0.8201855611976102,
0.8194689754558628,
0.792507679596051,
0.8748126109754423,
0.8299510305152616,
0.9619426556959261,
0.8627070028560689
],
"xaxis": "x",
"yaxis": "y"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=train<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
0.6957767388290562,
0.7579420785485997,
0.6956346694277136,
0.7076445232623223,
0.7807306734457936,
0.7139655931983309,
0.7482502662331184,
0.6369069500843659,
0.6658508493766556,
0.6504235710140688,
0.7192157983049555,
0.8018422916166743,
0.7177609439190309,
0.7101045697103688,
0.6571780241576142,
0.7680272436762176,
0.7234850964131593,
0.7152898708694893,
0.9501355018154564,
0.6962127827942286,
0.7684203207980727,
0.6855948971499829,
0.765605006674216,
0.7443232402476876,
0.7041761758241216,
0.8326497880219594,
0.7441778029675181,
0.7008454983313879,
0.7537693597675987,
0.7604977355277929,
0.6549109506960609,
0.7436588048190768,
0.736629241812364,
0.7186606117959837,
0.743294708716713,
0.7984912579429615,
0.8887305476049442,
0.6889141161032268,
0.7456194127120711,
0.6994897408446203,
0.7583012514203751,
0.7085664340222914,
0.6923652600780655,
0.7408748794305987,
0.711424171061503,
0.6311728944545804,
0.6777055537053459,
0.7255800325087469,
0.648567491821295,
0.6743371742523807,
0.8018549612389473,
0.7894137282256273,
0.7128392177932913,
0.7188183914165817,
0.7704057820977789,
0.7196836946749967,
0.7360703150646857,
0.6815385098793996,
0.6487310877592406,
0.697329063568009,
0.6597277478579927,
0.8184293996585623,
0.7593483031078324,
0.6532305277723722,
0.7114445030553102,
0.7014456446951195,
0.7153770310598158,
0.7888810617640961,
0.8567450102581987,
0.7352496863055631,
0.7409637400944985,
0.7436246856280054,
0.6776765762765335,
0.7574849242619103,
0.7781943930539938,
0.6705833254898479,
0.6996804955392666,
0.708413777144502,
0.6208481877780442,
0.7631377712831763,
0.7236950904524616,
0.6930659760723458,
0.8197788002483368,
0.7183749308247108,
0.708279940474656,
0.6716302691422836,
0.7227227641784396,
0.7176967428668025,
0.6344405133500717,
0.7347697535407555,
0.6137191056434405,
0.704325807968194,
0.682901324910752,
0.740582042072576,
0.8504604982454053,
0.6576669009983956,
0.7377992639136004,
0.6768728894788933,
0.7390305317492698,
0.6779588884193068,
0.7010958025289625,
0.6808620853852188,
0.7342685181593549,
0.7247058033450888,
0.6666900634685694,
0.7249360881820055,
0.6821873098905429,
0.8266266825008808,
0.7802529245761647,
0.74664178598042,
0.7353986890974347,
0.7470097879184785,
0.6901197315623567,
0.7382066089989482,
0.6589753496564349,
0.8008982879808738,
0.7168379312540042,
0.7521414198277515,
0.6953320433269816,
0.8073035301459459,
0.8027947050396425,
0.7043228507137288,
0.7231539855105249,
0.7383263907454282,
0.7576679699471423,
0.663538429030986,
0.6595808033434192,
0.7882393615703649,
0.7930397325325561,
0.7350673331350056,
0.7353002557889453,
0.5974027090802823,
0.71498635419546,
0.7622099782316335,
0.7391795687415161,
0.6675464131038639,
0.7154656258748834,
0.7437529076201842,
0.6211596349938211,
0.7188358964678377,
0.849937090645506,
0.7095529861328113,
0.7406039820475517,
0.8252001843610418,
0.6600510705360498,
0.8210357263546099,
0.6929119099897709,
0.7213879617650821,
0.75916409837913,
0.7427633178667192,
0.7552302281942865,
0.7063101613043915,
0.750297544678819,
0.7543911574920913,
0.6966625389622579,
0.7144451621430684,
0.6869972931308058,
0.677188890003343,
0.9156123844751002,
0.6553838580181919,
0.7216969692662536,
0.7057076998477817,
0.7288073281835964,
0.6725923362897323,
0.7143254559469274,
0.7518074207615209,
0.8335512685134062,
0.7759626794704192,
0.8856844683478049,
0.7041490384164095,
0.7165191930620582,
0.7135550060411538,
0.7912155955788918,
0.7551034045584042,
0.6986803166472683,
0.754824217875114,
0.7302660247050152,
0.7292630043525107,
0.571656914618242,
0.6698699469374744,
0.7984652233767994,
0.7727344176188279,
0.8009444873466395,
0.7941470756172919,
0.7652444284493098,
0.6741336809466748,
0.7539180675581941,
0.8697425458622727,
0.6918630406009729,
0.7489767304058851,
0.736384242889192,
0.7813447997158885,
0.7171120058842186,
0.7750322398349039,
0.8005281011285704,
0.7211245778376837,
0.7673632996245495,
0.7783697594140624,
0.7266884553674846,
0.6756399302317858,
0.6556438756089052,
0.7128589961833791,
0.7581993529758215,
0.6609844666486506,
0.7097778453330825,
0.7400669997825543,
0.8194161734102894,
0.6364360529422222,
0.7037773962740846,
0.8696228178354303,
0.7345582917241741,
0.643112997052813,
0.7610320975809615,
0.753687475759657,
0.7468750326384852,
0.7035578981973345,
0.7310425504599337,
0.7897991836049737,
0.7245144286237929,
0.6316741098779987,
0.7012328824900284,
0.5900419819184002,
0.8182166949314503,
0.7228756964171011,
0.6440790377007117,
0.7668502479739949,
0.7994625786967952,
0.6916093734832223,
0.654390476398534,
0.7686898144257487,
0.6858388736601835,
0.7414441464634663,
0.7167581644002714,
0.7534962658828813,
0.7785786109741161,
0.7709302585561065,
0.8217756641797949,
0.7467392332137058,
0.81777684562388,
0.7473614292639223,
0.7611878839275079,
0.7117761650105634,
0.7085496761694915,
0.7124153888060497,
0.7010151710136044,
0.7869222008593535,
0.711698947348409,
0.7471849590974714,
0.7444455266265991,
0.7109919785851376,
0.6761384312121954,
0.7069524601995654,
0.6769302705298076,
0.6600007718102588,
0.7134572349000202,
0.6776847453414893,
0.7701252174850034,
0.7619024930673666,
0.7350124211266063,
0.7282874850408689,
0.7150121553663806,
0.7686694919390304,
0.710804915025901,
0.754091001033791,
0.659766025750913,
0.7343426894879505,
0.6882973972778424,
0.7017239067139008,
0.6961212586520296,
0.68280329952847,
0.7474987582590388,
0.7024630975520628,
0.6982331978086047,
0.6592616699060925,
0.6607786597175946,
0.6330185581995625,
0.6492017860797994,
0.7432473772469652,
0.6808312235563921,
0.7015910481109136,
0.7295204842133106,
0.727575624013537,
0.6290966993880249,
0.8097507802881175,
0.6950500355035831,
0.7610194730389668,
0.7653203010902764,
0.7930178316838892,
0.734057647575713,
0.6882673397419932,
0.7068002726242354,
0.793009415598764,
0.643454861462673,
0.676368701856204,
0.7430550382270783,
0.7719090768075099,
0.6580654026532401,
0.7428410268158974,
0.7592748841815049,
0.7695088759512342,
0.702181066664115,
0.7711052428118305,
0.7434207560001076,
0.7404961692611163,
0.7318114780227037,
0.7461892166750458,
0.7733820478206747,
0.6616145428467011,
0.7695717007796302,
0.7980215897653987,
0.747678639463431,
0.7509209450577673,
0.6585229935121563,
0.737340279891889,
0.7299769809214144,
0.6643481197819185,
0.7141036136948552,
0.7186136947767433,
0.7683181698119435,
0.7266528742664866,
0.7207220525925799,
0.7262594771454401,
0.699484819588429,
0.7087768808001038,
0.7031346970669167,
0.7759793525469474,
0.6645736829343425,
0.7238090745888058,
0.756500387168653,
0.7212253755892469,
0.9439048564746342,
0.7170936399576909,
0.6684835207627098,
0.7427154430013124
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=test<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
0.6512761063171582,
0.7287342850883989,
0.7577025876749072,
0.7592639273035047,
0.6818876474384771,
0.7152573447252137,
0.774350664396305,
0.6550472488733596,
0.7738981884615284,
0.7541243234162924,
0.7519536763883761,
0.8320210261974733,
0.7426518936353101,
0.7265979080155922,
0.890966113872019,
0.7353818553266447,
0.7634921579881796,
0.8294904864250581,
0.8137172697009756,
0.8420971446336485,
0.6893957895813114,
0.7413429532184015,
0.7582374198607541,
0.7828280012313105,
0.8391627159272224,
0.7217988885724745,
0.7162480084188481,
0.6981259451671704,
0.6588040830433353,
0.7549114753010229,
0.7674729370748693,
0.7677466078268333,
0.7813055362210076,
0.825251692431862,
0.6297959602274461,
0.7641061587072203,
0.7753469951347335,
0.674860172370046,
0.7938148378683839,
0.8198431171016068,
0.7595090110974859,
0.7507352656520713,
0.721289644501111,
0.741448088675425,
0.7510460463679841,
0.6800620536804655,
0.7915508451351532,
0.7535140620444722,
0.697352781045208,
0.6921955653816687,
0.7381201713883649,
0.7590104632379403,
0.7947524562852025,
0.7597505011395934,
0.7777522657386521,
0.7939587696382155,
0.7070668790582435,
0.840089831842377,
0.6880689281216312,
0.7677769805308678,
0.7233632378806841,
0.7961527561562991,
0.6980444937348311,
0.75390648995092,
0.6749693686864726,
0.7568808238483822,
0.7172913226413175,
0.7473527225327126,
0.8457406370143061,
0.8437228274714977,
0.6958621694379827,
0.7765377372889585,
0.6973343891702213,
0.7756899803088712,
0.6635907810675821,
0.6488560596672128,
0.7298030259010234,
0.6846443367035591,
0.7001117775348583,
0.6966337253874092,
0.7232555055076415,
0.757474542932862,
0.6721150769416768,
0.7386670808176127,
0.6698897461312842,
0.7604099701068538,
0.6751748245607633,
0.6911357942165284,
0.7475988151834796,
0.7250925137024565,
0.7682822086780725,
0.6001504764928414,
0.7286482604645207,
0.7850979345046225,
0.7489292588984857,
0.7863999187035979,
0.7441989398358548,
0.6878769037784354,
0.7087455880321216,
0.7244528847914316,
0.7574691637677696,
0.776178673818935,
0.6687352373386877,
0.6726942377442499,
0.8771278312509985,
0.7276308021233041,
0.7367105247576413,
0.724957581832504,
0.734476946987388,
0.7037027676285385,
0.7196719298703599,
0.701205421305578,
0.7456378459685954,
0.7447862922562046,
0.7559499820995036,
0.7413402255446677,
0.7794929969859868,
0.6717172956350257,
0.7260762653292592,
0.753871022836222,
0.6352735205755001,
0.702039186060966,
0.7170572347148828,
0.7691696980107645,
0.728628644727037,
0.7968335420664091,
0.725401078187473,
0.7558589479031697,
0.6981586858386896,
0.7279514633434744,
0.8014665326776104,
0.7785624955253924,
0.8220831227737969,
0.6529629264385312,
0.8110829165536457,
0.7278873922826562,
0.7178469882442876,
0.7530896449767179,
0.772948903859944,
0.7575907169850528,
0.7082766658229158,
0.7226745075155961,
0.7018628141104125,
0.7235102040468365,
0.7564796924870556,
0.6510823792096195,
0.7744862914197582,
0.7086084098502505,
0.7296300991380639,
0.7173922813079803,
0.7516377265688725,
0.7508222794968331,
0.7338108329171547,
0.7786666590377729,
0.7828127092439076,
0.7114189502591047,
0.6482639628070281,
0.6457180925320146,
0.7522849702874722,
0.7676087704573854,
0.7021592330299424,
0.7501184007139824,
0.7127376229055441,
0.6135183363691984,
0.7353889483569046,
0.7512335236262094,
0.6589844377556087,
0.8108130845116872,
0.7484309312232497,
0.7157749767777302,
0.6755528085360091,
0.7067158772016874,
0.7682838787038063,
0.694274162009842,
0.7332763277935249,
0.7339484249124498,
0.7540065545888932,
0.7051222601331073,
0.6690000295656567,
0.6574876329908412,
0.763910578747556,
0.7439229694926585,
0.7327668331541545,
0.7218746066334083,
0.7173197798553519,
0.6233098479185598,
0.7265117625117632,
0.7717410161749992,
0.6854051180600711,
0.7647126507746091,
0.6792154479509659,
0.7081674789700858,
0.8082148565794375,
0.6911247759297874,
0.6437947506940874,
0.6959960643233447,
0.6751843420109633,
0.8249849915597032,
0.6037653436571525,
0.7652836594437318,
0.7977485667462958,
0.7534843359114771,
0.7199205406491861,
0.7865275101159649,
0.6417750850147381,
0.7199948511420241,
0.7352368906770536,
0.7534040762347791,
0.7381242671779171,
0.6815418931625805,
0.7633178120388293,
0.821504995270443,
0.7527585939232398,
0.7282115955272953,
0.6643874541733705,
0.7318894661339506,
0.6659948953312396,
0.8150879834179252,
0.6910386690814135,
0.6301591309232464,
0.7939423838462603,
0.628872152793837,
0.7657494007517387,
0.7343479723046831,
0.694662170867962,
0.7348332712061992,
0.7106706446249165,
0.7058771007273967,
0.7747826709165979,
0.7392215059198428,
0.67774434282123,
0.7831071588877654,
0.7554431084313338,
0.7040343241736384,
0.7001366857748546,
0.6847753231240155,
0.6959848301196746,
0.7813554656619699,
0.7150246292072198,
0.6633387291516334,
0.715043596855753,
0.6950135117043482,
0.7069528542592999,
0.6784499436539065,
0.7688234340496489,
0.7241563565971186,
0.7166312309451826,
0.7420523032504507,
0.706267239202893,
0.6442057457506499,
0.6182945197725017,
0.738526489411579,
0.7435080221752528,
0.6599631270531486,
0.669158474503511,
0.7084782747637822,
0.7682136631600044,
0.692764183789182,
0.8343481414223372,
0.6465626343994355,
0.7519945566830063,
0.6901944784591544,
0.6822869701777718,
0.735620343723582,
0.7434270528066679,
0.7530717871104928,
0.7089706943819438,
0.6528323666858076,
0.7703690530748709,
0.7063417690564554,
0.659407045000513,
0.6748263472240738,
0.7539619264384773,
0.7329013588219363,
0.7278093719576061,
0.6798215965900886,
0.7321235458031602,
0.7623326472544149,
0.7106304447454121,
0.7298353283791518,
0.8280081177680023,
0.8487875303968083,
0.7586902498993008,
0.7262160395789158,
0.7571717574914204,
0.7170194298438589,
0.7135185051713953,
0.7651642045562742,
0.8141800133606178,
0.744641171259305,
0.7047019700477195,
0.7210806601313137,
0.7938074864499073,
0.7314762854007714,
0.6527824764542026,
0.7331345189381514,
0.7035321947901184,
0.7512717385292428,
0.6666258593315276,
0.708048262930314,
0.6874076655010792,
0.6668679623890902,
0.6958574562606905,
0.7502669227156041,
0.6436955328817746,
0.6715456271437414,
0.7528740905989604,
0.7462720387872654,
0.7454392784033899,
0.6555255764141893,
0.6879756182494516,
0.7555951795428152,
0.6682754679285435,
0.7466269192055776,
0.6378575070494119,
0.7708276995608967,
0.7218637002364144,
0.8453927000677727,
0.7638858871897566,
0.712437737726452,
0.6873683133158265,
0.7074891731082739,
0.7216305393684207,
0.7618463147488342,
0.7494764318918425,
0.759075131167549,
0.7038415933653567,
0.7075232375166159,
0.7874649158888938,
0.7313010819863205,
0.7350274389477689,
0.7338570735840917,
0.7480964597537426,
0.7399861283611244,
0.6911968351977298
],
"xaxis": "x",
"yaxis": "y"
}
],
"layout": {
"annotations": [
{
"font": {},
"showarrow": false,
"text": "dataset=test",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.2425,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "dataset=train",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.7575000000000001,
"yanchor": "middle",
"yref": "paper"
}
],
"barmode": "overlay",
"legend": {
"title": {
"text": "label"
},
"tracegroupgap": 0
},
"margin": {
"t": 60
},
"template": {
"data": {
"bar": [
{
"error_x": {
"color": "#2a3f5f"
},
"error_y": {
"color": "#2a3f5f"
},
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "bar"
}
],
"barpolar": [
{
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "barpolar"
}
],
"carpet": [
{
"aaxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"baxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"type": "carpet"
}
],
"choropleth": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "choropleth"
}
],
"contour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "contour"
}
],
"contourcarpet": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "contourcarpet"
}
],
"heatmap": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmap"
}
],
"heatmapgl": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmapgl"
}
],
"histogram": [
{
"marker": {
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "histogram"
}
],
"histogram2d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2d"
}
],
"histogram2dcontour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2dcontour"
}
],
"mesh3d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "mesh3d"
}
],
"parcoords": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "parcoords"
}
],
"pie": [
{
"automargin": true,
"type": "pie"
}
],
"scatter": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter"
}
],
"scatter3d": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter3d"
}
],
"scattercarpet": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattercarpet"
}
],
"scattergeo": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergeo"
}
],
"scattergl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergl"
}
],
"scattermapbox": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattermapbox"
}
],
"scatterpolar": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolar"
}
],
"scatterpolargl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolargl"
}
],
"scatterternary": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterternary"
}
],
"surface": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "surface"
}
],
"table": [
{
"cells": {
"fill": {
"color": "#EBF0F8"
},
"line": {
"color": "white"
}
},
"header": {
"fill": {
"color": "#C8D4E3"
},
"line": {
"color": "white"
}
},
"type": "table"
}
]
},
"layout": {
"annotationdefaults": {
"arrowcolor": "#2a3f5f",
"arrowhead": 0,
"arrowwidth": 1
},
"autotypenumbers": "strict",
"coloraxis": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"colorscale": {
"diverging": [
[
0,
"#8e0152"
],
[
0.1,
"#c51b7d"
],
[
0.2,
"#de77ae"
],
[
0.3,
"#f1b6da"
],
[
0.4,
"#fde0ef"
],
[
0.5,
"#f7f7f7"
],
[
0.6,
"#e6f5d0"
],
[
0.7,
"#b8e186"
],
[
0.8,
"#7fbc41"
],
[
0.9,
"#4d9221"
],
[
1,
"#276419"
]
],
"sequential": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"sequentialminus": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
]
},
"colorway": [
"#636efa",
"#EF553B",
"#00cc96",
"#ab63fa",
"#FFA15A",
"#19d3f3",
"#FF6692",
"#B6E880",
"#FF97FF",
"#FECB52"
],
"font": {
"color": "#2a3f5f"
},
"geo": {
"bgcolor": "white",
"lakecolor": "white",
"landcolor": "#E5ECF6",
"showlakes": true,
"showland": true,
"subunitcolor": "white"
},
"hoverlabel": {
"align": "left"
},
"hovermode": "closest",
"mapbox": {
"style": "light"
},
"paper_bgcolor": "white",
"plot_bgcolor": "#E5ECF6",
"polar": {
"angularaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"radialaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"scene": {
"xaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"yaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"zaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
}
},
"shapedefaults": {
"line": {
"color": "#2a3f5f"
}
},
"ternary": {
"aaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"baxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"caxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"title": {
"x": 0.05
},
"xaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
},
"yaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
}
}
},
"width": 500,
"xaxis": {
"anchor": "y",
"domain": [
0,
0.98
],
"title": {
"text": "cosine_similarity"
}
},
"xaxis2": {
"anchor": "y2",
"domain": [
0,
0.98
],
"matches": "x",
"showticklabels": false
},
"yaxis": {
"anchor": "x",
"domain": [
0,
0.485
],
"title": {
"text": "count"
}
},
"yaxis2": {
"anchor": "x2",
"domain": [
0.515,
1
],
"matches": "y",
"title": {
"text": "count"
}
}
}
}
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"train accuracy: 87.9% ± 2.5%\n",
"test accuracy: 88.5% ± 2.4%\n"
]
}
],
"source": [
"# calculate accuracy (and its standard error) of predicting label=1 if similarity>x\n",
"# x is optimized by sweeping from -1 to 1 in steps of 0.01\n",
"def accuracy_and_se(cosine_similarity: float, labeled_similarity: int) -> Tuple[float]:\n",
" accuracies = []\n",
" for threshold_thousandths in range(-1000, 1000, 1):\n",
" threshold = threshold_thousandths / 1000\n",
" total = 0\n",
" correct = 0\n",
" for cs, ls in zip(cosine_similarity, labeled_similarity):\n",
" total += 1\n",
" if cs > threshold:\n",
" prediction = 1\n",
" else:\n",
" prediction = -1\n",
" if prediction == ls:\n",
" correct += 1\n",
" accuracy = correct / total\n",
" accuracies.append(accuracy)\n",
" a = max(accuracies)\n",
" n = len(cosine_similarity)\n",
" standard_error = (a * (1 - a) / n) ** 0.5 # standard error of binomial\n",
" return a, standard_error\n",
"\n",
"\n",
"# check that training and test sets are balanced\n",
"px.histogram(\n",
" df,\n",
" x=\"cosine_similarity\",\n",
" color=\"label\",\n",
" barmode=\"overlay\",\n",
" width=500,\n",
" facet_row=\"dataset\",\n",
").show()\n",
"\n",
"for dataset in [\"train\", \"test\"]:\n",
" data = df[df[\"dataset\"] == dataset]\n",
" a, se = accuracy_and_se(data[\"cosine_similarity\"], data[\"label\"])\n",
" print(f\"{dataset} accuracy: {a:0.1%} ± {1.96 * se:0.1%}\")\n"
]
},
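  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an optional aside, the threshold sweep above can also be vectorized with numpy; this is a sketch of an assumed-equivalent formulation (the reported numbers come from the loop-based function above), and the helper name `accuracy_and_se_vectorized` is illustrative."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# illustrative vectorized version of the threshold sweep above (same logic, assumed equivalent)\n",
    "import numpy as np\n",
    "\n",
    "\n",
    "def accuracy_and_se_vectorized(cosine_similarity, labeled_similarity) -> Tuple[float, float]:\n",
    "    cs = np.asarray(cosine_similarity, dtype=float)  # shape (n,)\n",
    "    ls = np.asarray(labeled_similarity, dtype=float)  # shape (n,), values in {-1, 1}\n",
    "    thresholds = np.arange(-1000, 1000) / 1000  # same grid as the loop above\n",
    "    # predictions[i, j] = 1 if pair j is above threshold i, else -1\n",
    "    predictions = np.where(cs[None, :] > thresholds[:, None], 1.0, -1.0)\n",
    "    accuracies = (predictions == ls[None, :]).mean(axis=1)\n",
    "    a = float(accuracies.max())\n",
    "    standard_error = (a * (1 - a) / len(cs)) ** 0.5  # standard error of binomial\n",
    "    return a, standard_error\n"
   ]
  },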
{
"cell_type": "markdown",
"metadata": {
"id": "zHLxlnsApgkR"
},
"source": [
"## 7. Optimize the matrix using the training data provided"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"id": "z52V0x8IpgkR"
},
"outputs": [],
"source": [
"def embedding_multiplied_by_matrix(\n",
" embedding: List[float], matrix: torch.tensor\n",
") -> np.array:\n",
" embedding_tensor = torch.tensor(embedding).float()\n",
" modified_embedding = embedding_tensor @ matrix\n",
" modified_embedding = modified_embedding.detach().numpy()\n",
" return modified_embedding\n",
"\n",
"\n",
"# compute custom embeddings and new cosine similarities\n",
"def apply_matrix_to_embeddings_dataframe(matrix: torch.tensor, df: pd.DataFrame):\n",
" for column in [\"text_1_embedding\", \"text_2_embedding\"]:\n",
" df[f\"{column}_custom\"] = df[column].apply(\n",
" lambda x: embedding_multiplied_by_matrix(x, matrix)\n",
" )\n",
" df[\"cosine_similarity_custom\"] = df.apply(\n",
" lambda row: cosine_similarity(\n",
" row[\"text_1_embedding_custom\"], row[\"text_2_embedding_custom\"]\n",
" ),\n",
" axis=1,\n",
" )\n"
]
},
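  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick usage sketch of the helpers above, using made-up toy values (a 4-dimensional embedding and a random 4x2 projection matrix) rather than data from this notebook:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# toy usage sketch of embedding_multiplied_by_matrix (dimensions are made up for illustration)\n",
    "toy_matrix = torch.randn(4, 2, requires_grad=True)  # projects 4-d embeddings down to 2-d\n",
    "toy_embedding_1 = [0.5, 0.5, 0.5, 0.5]\n",
    "toy_embedding_2 = [0.5, 0.5, -0.5, -0.5]\n",
    "\n",
    "custom_1 = embedding_multiplied_by_matrix(toy_embedding_1, toy_matrix)\n",
    "custom_2 = embedding_multiplied_by_matrix(toy_embedding_2, toy_matrix)\n",
    "print(custom_1.shape)  # (2,)\n",
    "print(cosine_similarity(custom_1, custom_2))  # similarity in the projected space\n"
   ]
  },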
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"id": "p2ZSXu6spgkR"
},
"outputs": [],
"source": [
"def optimize_matrix(\n",
" modified_embedding_length: int = 2048, # in my brief experimentation, bigger was better (2048 is length of babbage encoding)\n",
" batch_size: int = 100,\n",
" max_epochs: int = 100,\n",
" learning_rate: float = 100.0, # seemed to work best when similar to batch size - feel free to try a range of values\n",
" dropout_fraction: float = 0.0, # in my testing, dropout helped by a couple percentage points (definitely not necessary)\n",
" df: pd.DataFrame = df,\n",
" print_progress: bool = True,\n",
" save_results: bool = True,\n",
") -> torch.tensor:\n",
" \"\"\"Return matrix optimized to minimize loss on training data.\"\"\"\n",
" run_id = random.randint(0, 2 ** 31 - 1) # (range is arbitrary)\n",
" # convert from dataframe to torch tensors\n",
" # e is for embedding, s for similarity label\n",
" def tensors_from_dataframe(\n",
" df: pd.DataFrame,\n",
" embedding_column_1: str,\n",
" embedding_column_2: str,\n",
" similarity_label_column: str,\n",
" ) -> Tuple[torch.tensor]:\n",
" e1 = np.stack(np.array(df[embedding_column_1].values))\n",
" e2 = np.stack(np.array(df[embedding_column_2].values))\n",
" s = np.stack(np.array(df[similarity_label_column].astype(\"float\").values))\n",
"\n",
" e1 = torch.from_numpy(e1).float()\n",
" e2 = torch.from_numpy(e2).float()\n",
" s = torch.from_numpy(s).float()\n",
"\n",
" return e1, e2, s\n",
"\n",
" e1_train, e2_train, s_train = tensors_from_dataframe(\n",
" df[df[\"dataset\"] == \"train\"], \"text_1_embedding\", \"text_2_embedding\", \"label\"\n",
" )\n",
" e1_test, e2_test, s_test = tensors_from_dataframe(\n",
" df[df[\"dataset\"] == \"train\"], \"text_1_embedding\", \"text_2_embedding\", \"label\"\n",
" )\n",
"\n",
" # create dataset and loader\n",
" dataset = torch.utils.data.TensorDataset(e1_train, e2_train, s_train)\n",
" train_loader = torch.utils.data.DataLoader(\n",
" dataset, batch_size=batch_size, shuffle=True\n",
" )\n",
"\n",
" # define model (similarity of projected embeddings)\n",
" def model(embedding_1, embedding_2, matrix, dropout_fraction=dropout_fraction):\n",
" e1 = torch.nn.functional.dropout(embedding_1, p=dropout_fraction)\n",
" e2 = torch.nn.functional.dropout(embedding_2, p=dropout_fraction)\n",
" modified_embedding_1 = e1 @ matrix # @ is matrix multiplication\n",
" modified_embedding_2 = e2 @ matrix\n",
" similarity = torch.nn.functional.cosine_similarity(\n",
" modified_embedding_1, modified_embedding_2\n",
" )\n",
" return similarity\n",
"\n",
" # define loss function to minimize\n",
" def mse_loss(predictions, targets):\n",
" difference = predictions - targets\n",
" return torch.sum(difference * difference) / difference.numel()\n",
"\n",
" # initialize projection matrix\n",
" embedding_length = len(df[\"text_1_embedding\"].values[0])\n",
" matrix = torch.randn(\n",
" embedding_length, modified_embedding_length, requires_grad=True\n",
" )\n",
"\n",
" epochs, types, losses, accuracies, matrices = [], [], [], [], []\n",
" for epoch in range(1, 1 + max_epochs):\n",
" # iterate through training dataloader\n",
" for a, b, actual_similarity in train_loader:\n",
" # generate prediction\n",
" predicted_similarity = model(a, b, matrix)\n",
" # get loss and perform backpropagation\n",
" loss = mse_loss(predicted_similarity, actual_similarity)\n",
" loss.backward()\n",
" # update the weights\n",
" with torch.no_grad():\n",
" matrix -= matrix.grad * learning_rate\n",
" # set gradients to zero\n",
" matrix.grad.zero_()\n",
" # calculate test loss\n",
" test_predictions = model(e1_test, e2_test, matrix)\n",
" test_loss = mse_loss(test_predictions, s_test)\n",
"\n",
" # compute custom embeddings and new cosine similarities\n",
" apply_matrix_to_embeddings_dataframe(matrix, df)\n",
"\n",
" # calculate test accuracy\n",
" for dataset in [\"train\", \"test\"]:\n",
" data = df[df[\"dataset\"] == dataset]\n",
" a, se = accuracy_and_se(data[\"cosine_similarity_custom\"], data[\"label\"])\n",
"\n",
" # record results of each epoch\n",
" epochs.append(epoch)\n",
" types.append(dataset)\n",
" losses.append(loss.item() if dataset == \"train\" else test_loss.item())\n",
" accuracies.append(a)\n",
" matrices.append(matrix.detach().numpy())\n",
"\n",
" # optionally print accuracies\n",
" if print_progress is True:\n",
" print(\n",
" f\"Epoch {epoch}/{max_epochs}: {dataset} accuracy: {a:0.1%} ± {1.96 * se:0.1%}\"\n",
" )\n",
"\n",
" data = pd.DataFrame(\n",
" {\"epoch\": epochs, \"type\": types, \"loss\": losses, \"accuracy\": accuracies}\n",
" )\n",
" data[\"run_id\"] = run_id\n",
" data[\"modified_embedding_length\"] = modified_embedding_length\n",
" data[\"batch_size\"] = batch_size\n",
" data[\"max_epochs\"] = max_epochs\n",
" data[\"learning_rate\"] = learning_rate\n",
" data[\"dropout_fraction\"] = dropout_fraction\n",
" data[\n",
" \"matrix\"\n",
" ] = matrices # saving every single matrix can get big; feel free to delete/change\n",
" if save_results is True:\n",
" data.to_csv(f\"{run_id}_optimization_results.csv\", index=False)\n",
"\n",
" return data\n"
]
},
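  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Note that the update inside `optimize_matrix` (`matrix -= matrix.grad * learning_rate`) is plain stochastic gradient descent written by hand. As an optional alternative sketch (an assumed-equivalent formulation, not what the function above does), the same step can be expressed with `torch.optim.SGD`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# alternative sketch: the manual gradient step expressed with torch.optim.SGD (illustrative only)\n",
    "embedding_length = 2048  # babbage embedding length, as assumed above\n",
    "projection = torch.randn(embedding_length, embedding_length, requires_grad=True)\n",
    "optimizer = torch.optim.SGD([projection], lr=100.0)  # lr plays the role of learning_rate above\n",
    "\n",
    "\n",
    "def sgd_step(e1_batch, e2_batch, labels):\n",
    "    optimizer.zero_grad()\n",
    "    similarity = torch.nn.functional.cosine_similarity(e1_batch @ projection, e2_batch @ projection)\n",
    "    loss = torch.nn.functional.mse_loss(similarity, labels)\n",
    "    loss.backward()\n",
    "    optimizer.step()  # for plain SGD this is exactly: projection -= lr * projection.grad\n",
    "    return loss.item()\n",
    "\n",
    "\n",
    "# exercise the step once on a random toy batch (not the notebook's embeddings)\n",
    "print(sgd_step(torch.randn(4, embedding_length), torch.randn(4, embedding_length), torch.tensor([1.0, -1.0, 1.0, -1.0])))\n"
   ]
  },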
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"id": "nlcUW-zEpgkS",
"outputId": "4bd4bdff-628a-406f-fffe-aedbfad66446"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/30: train accuracy: 87.6% ± 2.5%\n",
"Epoch 1/30: test accuracy: 88.4% ± 2.4%\n",
"Epoch 2/30: train accuracy: 88.0% ± 2.5%\n",
"Epoch 2/30: test accuracy: 88.8% ± 2.4%\n",
"Epoch 3/30: train accuracy: 89.4% ± 2.3%\n",
"Epoch 3/30: test accuracy: 88.7% ± 2.4%\n",
"Epoch 4/30: train accuracy: 89.8% ± 2.3%\n",
"Epoch 4/30: test accuracy: 88.4% ± 2.4%\n",
"Epoch 5/30: train accuracy: 90.4% ± 2.2%\n",
"Epoch 5/30: test accuracy: 88.5% ± 2.4%\n",
"Epoch 6/30: train accuracy: 90.6% ± 2.2%\n",
"Epoch 6/30: test accuracy: 88.8% ± 2.4%\n",
"Epoch 7/30: train accuracy: 90.9% ± 2.2%\n",
"Epoch 7/30: test accuracy: 89.1% ± 2.4%\n",
"Epoch 8/30: train accuracy: 91.3% ± 2.1%\n",
"Epoch 8/30: test accuracy: 89.3% ± 2.3%\n",
"Epoch 9/30: train accuracy: 91.3% ± 2.1%\n",
"Epoch 9/30: test accuracy: 89.6% ± 2.3%\n",
"Epoch 10/30: train accuracy: 91.6% ± 2.1%\n",
"Epoch 10/30: test accuracy: 89.6% ± 2.3%\n",
"Epoch 11/30: train accuracy: 91.8% ± 2.1%\n",
"Epoch 11/30: test accuracy: 89.9% ± 2.3%\n",
"Epoch 12/30: train accuracy: 92.1% ± 2.0%\n",
"Epoch 12/30: test accuracy: 90.0% ± 2.3%\n",
"Epoch 13/30: train accuracy: 92.4% ± 2.0%\n",
"Epoch 13/30: test accuracy: 90.3% ± 2.2%\n",
"Epoch 14/30: train accuracy: 92.4% ± 2.0%\n",
"Epoch 14/30: test accuracy: 90.3% ± 2.2%\n",
"Epoch 15/30: train accuracy: 92.7% ± 2.0%\n",
"Epoch 15/30: test accuracy: 90.4% ± 2.2%\n",
"Epoch 16/30: train accuracy: 92.7% ± 2.0%\n",
"Epoch 16/30: test accuracy: 90.6% ± 2.2%\n",
"Epoch 17/30: train accuracy: 92.5% ± 2.0%\n",
"Epoch 17/30: test accuracy: 90.9% ± 2.2%\n",
"Epoch 18/30: train accuracy: 92.7% ± 2.0%\n",
"Epoch 18/30: test accuracy: 91.3% ± 2.1%\n",
"Epoch 19/30: train accuracy: 92.8% ± 2.0%\n",
"Epoch 19/30: test accuracy: 91.5% ± 2.1%\n",
"Epoch 20/30: train accuracy: 93.0% ± 1.9%\n",
"Epoch 20/30: test accuracy: 91.6% ± 2.1%\n",
"Epoch 21/30: train accuracy: 93.3% ± 1.9%\n",
"Epoch 21/30: test accuracy: 91.6% ± 2.1%\n",
"Epoch 22/30: train accuracy: 93.3% ± 1.9%\n",
"Epoch 22/30: test accuracy: 91.6% ± 2.1%\n",
"Epoch 23/30: train accuracy: 93.4% ± 1.9%\n",
"Epoch 23/30: test accuracy: 91.8% ± 2.1%\n",
"Epoch 24/30: train accuracy: 93.6% ± 1.9%\n",
"Epoch 24/30: test accuracy: 91.8% ± 2.1%\n",
"Epoch 25/30: train accuracy: 93.6% ± 1.9%\n",
"Epoch 25/30: test accuracy: 91.8% ± 2.1%\n",
"Epoch 26/30: train accuracy: 93.6% ± 1.9%\n",
"Epoch 26/30: test accuracy: 92.1% ± 2.0%\n",
"Epoch 27/30: train accuracy: 93.7% ± 1.8%\n",
"Epoch 27/30: test accuracy: 92.2% ± 2.0%\n",
"Epoch 28/30: train accuracy: 93.7% ± 1.8%\n",
"Epoch 28/30: test accuracy: 92.2% ± 2.0%\n",
"Epoch 29/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 29/30: test accuracy: 92.2% ± 2.0%\n",
"Epoch 30/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 30/30: test accuracy: 92.4% ± 2.0%\n",
"Epoch 1/30: train accuracy: 88.0% ± 2.5%\n",
"Epoch 1/30: test accuracy: 88.7% ± 2.4%\n",
"Epoch 2/30: train accuracy: 88.8% ± 2.4%\n",
"Epoch 2/30: test accuracy: 88.7% ± 2.4%\n",
"Epoch 3/30: train accuracy: 89.5% ± 2.3%\n",
"Epoch 3/30: test accuracy: 89.3% ± 2.3%\n",
"Epoch 4/30: train accuracy: 89.8% ± 2.3%\n",
"Epoch 4/30: test accuracy: 88.8% ± 2.4%\n",
"Epoch 5/30: train accuracy: 90.1% ± 2.3%\n",
"Epoch 5/30: test accuracy: 89.1% ± 2.4%\n",
"Epoch 6/30: train accuracy: 90.4% ± 2.2%\n",
"Epoch 6/30: test accuracy: 89.3% ± 2.3%\n",
"Epoch 7/30: train accuracy: 91.0% ± 2.2%\n",
"Epoch 7/30: test accuracy: 89.6% ± 2.3%\n",
"Epoch 8/30: train accuracy: 91.3% ± 2.1%\n",
"Epoch 8/30: test accuracy: 90.0% ± 2.3%\n",
"Epoch 9/30: train accuracy: 91.5% ± 2.1%\n",
"Epoch 9/30: test accuracy: 90.3% ± 2.2%\n",
"Epoch 10/30: train accuracy: 91.6% ± 2.1%\n",
"Epoch 10/30: test accuracy: 90.4% ± 2.2%\n",
"Epoch 11/30: train accuracy: 91.9% ± 2.1%\n",
"Epoch 11/30: test accuracy: 90.6% ± 2.2%\n",
"Epoch 12/30: train accuracy: 91.9% ± 2.1%\n",
"Epoch 12/30: test accuracy: 90.7% ± 2.2%\n",
"Epoch 13/30: train accuracy: 91.9% ± 2.1%\n",
"Epoch 13/30: test accuracy: 90.7% ± 2.2%\n",
"Epoch 14/30: train accuracy: 92.1% ± 2.0%\n",
"Epoch 14/30: test accuracy: 90.9% ± 2.2%\n",
"Epoch 15/30: train accuracy: 92.2% ± 2.0%\n",
"Epoch 15/30: test accuracy: 91.0% ± 2.2%\n",
"Epoch 16/30: train accuracy: 92.5% ± 2.0%\n",
"Epoch 16/30: test accuracy: 91.5% ± 2.1%\n",
"Epoch 17/30: train accuracy: 92.8% ± 2.0%\n",
"Epoch 17/30: test accuracy: 91.6% ± 2.1%\n",
"Epoch 18/30: train accuracy: 93.0% ± 1.9%\n",
"Epoch 18/30: test accuracy: 91.8% ± 2.1%\n",
"Epoch 19/30: train accuracy: 93.1% ± 1.9%\n",
"Epoch 19/30: test accuracy: 91.9% ± 2.1%\n",
"Epoch 20/30: train accuracy: 93.1% ± 1.9%\n",
"Epoch 20/30: test accuracy: 91.9% ± 2.1%\n",
"Epoch 21/30: train accuracy: 93.3% ± 1.9%\n",
"Epoch 21/30: test accuracy: 92.1% ± 2.0%\n",
"Epoch 22/30: train accuracy: 93.4% ± 1.9%\n",
"Epoch 22/30: test accuracy: 92.4% ± 2.0%\n",
"Epoch 23/30: train accuracy: 93.6% ± 1.9%\n",
"Epoch 23/30: test accuracy: 92.7% ± 2.0%\n",
"Epoch 24/30: train accuracy: 93.7% ± 1.8%\n",
"Epoch 24/30: test accuracy: 92.7% ± 2.0%\n",
"Epoch 25/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 25/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 26/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 26/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 27/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 27/30: test accuracy: 93.1% ± 1.9%\n",
"Epoch 28/30: train accuracy: 94.0% ± 1.8%\n",
"Epoch 28/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 29/30: train accuracy: 94.2% ± 1.8%\n",
"Epoch 29/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 30/30: train accuracy: 94.3% ± 1.8%\n",
"Epoch 30/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 1/30: train accuracy: 87.1% ± 2.5%\n",
"Epoch 1/30: test accuracy: 88.1% ± 2.5%\n",
"Epoch 2/30: train accuracy: 88.6% ± 2.4%\n",
"Epoch 2/30: test accuracy: 88.2% ± 2.4%\n",
"Epoch 3/30: train accuracy: 89.4% ± 2.3%\n",
"Epoch 3/30: test accuracy: 89.0% ± 2.4%\n",
"Epoch 4/30: train accuracy: 90.3% ± 2.2%\n",
"Epoch 4/30: test accuracy: 89.3% ± 2.3%\n",
"Epoch 5/30: train accuracy: 90.4% ± 2.2%\n",
"Epoch 5/30: test accuracy: 89.4% ± 2.3%\n",
"Epoch 6/30: train accuracy: 91.0% ± 2.2%\n",
"Epoch 6/30: test accuracy: 89.9% ± 2.3%\n",
"Epoch 7/30: train accuracy: 91.5% ± 2.1%\n",
"Epoch 7/30: test accuracy: 90.1% ± 2.3%\n",
"Epoch 8/30: train accuracy: 91.6% ± 2.1%\n",
"Epoch 8/30: test accuracy: 90.4% ± 2.2%\n",
"Epoch 9/30: train accuracy: 91.9% ± 2.1%\n",
"Epoch 9/30: test accuracy: 90.7% ± 2.2%\n",
"Epoch 10/30: train accuracy: 92.2% ± 2.0%\n",
"Epoch 10/30: test accuracy: 90.9% ± 2.2%\n",
"Epoch 11/30: train accuracy: 92.4% ± 2.0%\n",
"Epoch 11/30: test accuracy: 91.2% ± 2.1%\n",
"Epoch 12/30: train accuracy: 92.5% ± 2.0%\n",
"Epoch 12/30: test accuracy: 91.2% ± 2.1%\n",
"Epoch 13/30: train accuracy: 92.7% ± 2.0%\n",
"Epoch 13/30: test accuracy: 91.3% ± 2.1%\n",
"Epoch 14/30: train accuracy: 92.8% ± 2.0%\n",
"Epoch 14/30: test accuracy: 91.5% ± 2.1%\n",
"Epoch 15/30: train accuracy: 93.1% ± 1.9%\n",
"Epoch 15/30: test accuracy: 91.8% ± 2.1%\n",
"Epoch 16/30: train accuracy: 93.3% ± 1.9%\n",
"Epoch 16/30: test accuracy: 91.6% ± 2.1%\n",
"Epoch 17/30: train accuracy: 93.4% ± 1.9%\n",
"Epoch 17/30: test accuracy: 91.9% ± 2.1%\n",
"Epoch 18/30: train accuracy: 93.6% ± 1.9%\n",
"Epoch 18/30: test accuracy: 92.1% ± 2.0%\n",
"Epoch 19/30: train accuracy: 93.9% ± 1.8%\n",
"Epoch 19/30: test accuracy: 92.1% ± 2.0%\n",
"Epoch 20/30: train accuracy: 94.0% ± 1.8%\n",
"Epoch 20/30: test accuracy: 92.2% ± 2.0%\n",
"Epoch 21/30: train accuracy: 94.0% ± 1.8%\n",
"Epoch 21/30: test accuracy: 92.2% ± 2.0%\n",
"Epoch 22/30: train accuracy: 94.0% ± 1.8%\n",
"Epoch 22/30: test accuracy: 92.5% ± 2.0%\n",
"Epoch 23/30: train accuracy: 94.2% ± 1.8%\n",
"Epoch 23/30: test accuracy: 92.5% ± 2.0%\n",
"Epoch 24/30: train accuracy: 94.2% ± 1.8%\n",
"Epoch 24/30: test accuracy: 92.5% ± 2.0%\n",
"Epoch 25/30: train accuracy: 94.3% ± 1.8%\n",
"Epoch 25/30: test accuracy: 92.7% ± 2.0%\n",
"Epoch 26/30: train accuracy: 94.5% ± 1.7%\n",
"Epoch 26/30: test accuracy: 92.7% ± 2.0%\n",
"Epoch 27/30: train accuracy: 94.5% ± 1.7%\n",
"Epoch 27/30: test accuracy: 92.8% ± 2.0%\n",
"Epoch 28/30: train accuracy: 94.5% ± 1.7%\n",
"Epoch 28/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 29/30: train accuracy: 94.5% ± 1.7%\n",
"Epoch 29/30: test accuracy: 93.0% ± 1.9%\n",
"Epoch 30/30: train accuracy: 94.6% ± 1.7%\n",
"Epoch 30/30: test accuracy: 93.1% ± 1.9%\n"
]
}
],
"source": [
"# example hyperparameter search\n",
"# I recommend starting with max_epochs=10 while initially exploring\n",
"results = []\n",
"max_epochs = 30\n",
"dropout_fraction = 0.2\n",
"for batch_size, learning_rate in [(10, 10), (100, 100), (1000, 1000)]:\n",
" result = optimize_matrix(\n",
" batch_size=batch_size,\n",
" learning_rate=learning_rate,\n",
" max_epochs=max_epochs,\n",
" dropout_fraction=dropout_fraction,\n",
" save_results=False,\n",
" )\n",
" results.append(result)\n"
]
},
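  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a small follow-up sketch (one assumed way to inspect the results; the variable names `search_results_df` and `best_by_test_accuracy` are illustrative), the per-run dataframes collected in `results` can be concatenated to see which hyperparameters reached the highest test accuracy:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# illustrative aggregation of the hyperparameter search results above\n",
    "search_results_df = pd.concat(results)  # one row per (run, epoch, train/test)\n",
    "best_by_test_accuracy = (\n",
    "    search_results_df[search_results_df[\"type\"] == \"test\"]\n",
    "    .sort_values(\"accuracy\", ascending=False)\n",
    "    .head(5)[[\"run_id\", \"batch_size\", \"learning_rate\", \"epoch\", \"accuracy\"]]\n",
    ")\n",
    "print(best_by_test_accuracy)\n"
   ]
  },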
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"id": "PoTZWC1SpgkS",
"outputId": "207360e5-fd07-4180-a143-0ec5dd27ffe1"
},
"outputs": [
{
"data": {
"application/vnd.plotly.v1+json": {
"config": {
"plotlyServerURL": "https://plot.ly"
},
"data": [
{
"customdata": [
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1966966104<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": true,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x7",
"y": [
1.5030940771102905,
0.8207949995994568,
0.8773147463798523,
0.8613348603248596,
0.6944154500961304,
0.6347973346710205,
0.702663004398346,
0.6778457164764404,
0.7463836073875427,
0.70860755443573,
0.829717755317688,
0.8753337264060974,
0.9517255425453186,
0.745735764503479,
0.8445643186569214,
0.6972991824150085,
0.7754084467887878,
0.8590495586395264,
0.7757335901260376,
0.8903113007545471,
0.7940390110015869,
0.7357985973358154,
0.7940884828567505,
0.7627162933349609,
0.6855032444000244,
0.9467343091964722,
0.7323927879333496,
0.7784497737884521,
0.6060311794281006,
0.5107579231262207
],
"yaxis": "y7"
},
{
"customdata": [
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1295327779<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x5",
"y": [
1.3088507652282715,
0.9660036563873291,
0.880219042301178,
0.8399844169616699,
0.8322086334228516,
0.8545096516609192,
0.7623728513717651,
0.8051943778991699,
0.7696108818054199,
0.737562358379364,
0.7953993082046509,
0.7336588501930237,
0.7842012047767639,
0.7281075119972229,
0.7858664393424988,
0.7507467865943909,
0.7651264071464539,
0.7124024629592896,
0.7779631614685059,
0.7413207292556763,
0.7219211459159851,
0.7167943716049194,
0.7299365401268005,
0.7841536998748779,
0.6576091051101685,
0.7138631343841553,
0.7275545597076416,
0.7669041752815247,
0.7839182019233704,
0.6869834661483765
],
"yaxis": "y5"
},
{
"customdata": [
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=2104013193<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x3",
"y": [
1.2908055782318115,
1.0658448934555054,
0.8479723930358887,
0.7903857827186584,
0.7801042199134827,
0.775238573551178,
0.7691900730133057,
0.7682435512542725,
0.765575647354126,
0.7624145150184631,
0.7591218948364258,
0.7522261142730713,
0.7516769766807556,
0.7472819685935974,
0.7461873292922974,
0.741317629814148,
0.7395408153533936,
0.7369651794433594,
0.735289990901947,
0.7335363030433655,
0.7322248220443726,
0.7293906211853027,
0.7247920036315918,
0.7275153398513794,
0.7206876873970032,
0.7157735824584961,
0.7161506414413452,
0.7103999257087708,
0.7092854380607605,
0.712189793586731
],
"yaxis": "y3"
},
{
"customdata": [
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1966966104<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": true,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x7",
"y": [
1.1806079149246216,
1.0105031728744507,
0.8882337808609009,
0.8209589719772339,
0.7949825525283813,
0.7838609218597412,
0.7761262655258179,
0.773082435131073,
0.7749849557876587,
0.7681840062141418,
0.7599595785140991,
0.7643077969551086,
0.7606286406517029,
0.7596479654312134,
0.7551064491271973,
0.756072461605072,
0.7553612589836121,
0.753299355506897,
0.7488323450088501,
0.7476798295974731,
0.7407615780830383,
0.7410925030708313,
0.7392419576644897,
0.7380637526512146,
0.7404897212982178,
0.7336447834968567,
0.7395608425140381,
0.7306341528892517,
0.7296173572540283,
0.7306488156318665
],
"yaxis": "y7"
},
{
"customdata": [
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1295327779<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x5",
"y": [
1.138457179069519,
0.9624069929122925,
0.854874312877655,
0.8080407977104187,
0.7818027138710022,
0.7792102694511414,
0.774451732635498,
0.7670709490776062,
0.7700219750404358,
0.7601531147956848,
0.7579351663589478,
0.7570049166679382,
0.7558397650718689,
0.7539790868759155,
0.7552328109741211,
0.7481580972671509,
0.7525877952575684,
0.7411936521530151,
0.7430469989776611,
0.7382174134254456,
0.735832691192627,
0.7340396642684937,
0.735627293586731,
0.735277533531189,
0.734030544757843,
0.7297038435935974,
0.7263182401657104,
0.7299252152442932,
0.72670578956604,
0.7222384810447693
],
"yaxis": "y5"
},
{
"customdata": [
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=2104013193<br>epoch=%{x}<br>loss=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x3",
"y": [
1.0625630617141724,
0.8497532606124878,
0.7860064506530762,
0.7821206450462341,
0.7733925580978394,
0.7731095552444458,
0.7659758925437927,
0.7659748792648315,
0.7658429145812988,
0.7546908259391785,
0.751862645149231,
0.7525656223297119,
0.7430839538574219,
0.7444459199905396,
0.7434905767440796,
0.7453382611274719,
0.7397355437278748,
0.7330746054649353,
0.735709547996521,
0.7287493348121643,
0.725609302520752,
0.7301373481750488,
0.7209747433662415,
0.7206069827079773,
0.7129198908805847,
0.7210001945495605,
0.7170187830924988,
0.713289201259613,
0.708259105682373,
0.7038670778274536
],
"yaxis": "y3"
}
],
"layout": {
"annotations": [
{
"font": {},
"showarrow": false,
"text": "batch_size=10",
"x": 0.15666666666666665,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "batch_size=100",
"x": 0.49,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "batch_size=1000",
"x": 0.8233333333333333,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=1000",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.15666666666666665,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=100",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.4999999999999999,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=10",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.8433333333333332,
"yanchor": "middle",
"yref": "paper"
}
],
"legend": {
"title": {
"text": "type"
},
"tracegroupgap": 0
},
"margin": {
"t": 60
},
"template": {
"data": {
"bar": [
{
"error_x": {
"color": "#2a3f5f"
},
"error_y": {
"color": "#2a3f5f"
},
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "bar"
}
],
"barpolar": [
{
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "barpolar"
}
],
"carpet": [
{
"aaxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"baxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"type": "carpet"
}
],
"choropleth": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "choropleth"
}
],
"contour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "contour"
}
],
"contourcarpet": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "contourcarpet"
}
],
"heatmap": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmap"
}
],
"heatmapgl": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmapgl"
}
],
"histogram": [
{
"marker": {
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "histogram"
}
],
"histogram2d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2d"
}
],
"histogram2dcontour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2dcontour"
}
],
"mesh3d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "mesh3d"
}
],
"parcoords": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "parcoords"
}
],
"pie": [
{
"automargin": true,
"type": "pie"
}
],
"scatter": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter"
}
],
"scatter3d": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter3d"
}
],
"scattercarpet": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattercarpet"
}
],
"scattergeo": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergeo"
}
],
"scattergl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergl"
}
],
"scattermapbox": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattermapbox"
}
],
"scatterpolar": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolar"
}
],
"scatterpolargl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolargl"
}
],
"scatterternary": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterternary"
}
],
"surface": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "surface"
}
],
"table": [
{
"cells": {
"fill": {
"color": "#EBF0F8"
},
"line": {
"color": "white"
}
},
"header": {
"fill": {
"color": "#C8D4E3"
},
"line": {
"color": "white"
}
},
"type": "table"
}
]
},
"layout": {
"annotationdefaults": {
"arrowcolor": "#2a3f5f",
"arrowhead": 0,
"arrowwidth": 1
},
"autotypenumbers": "strict",
"coloraxis": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"colorscale": {
"diverging": [
[
0,
"#8e0152"
],
[
0.1,
"#c51b7d"
],
[
0.2,
"#de77ae"
],
[
0.3,
"#f1b6da"
],
[
0.4,
"#fde0ef"
],
[
0.5,
"#f7f7f7"
],
[
0.6,
"#e6f5d0"
],
[
0.7,
"#b8e186"
],
[
0.8,
"#7fbc41"
],
[
0.9,
"#4d9221"
],
[
1,
"#276419"
]
],
"sequential": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"sequentialminus": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
]
},
"colorway": [
"#636efa",
"#EF553B",
"#00cc96",
"#ab63fa",
"#FFA15A",
"#19d3f3",
"#FF6692",
"#B6E880",
"#FF97FF",
"#FECB52"
],
"font": {
"color": "#2a3f5f"
},
"geo": {
"bgcolor": "white",
"lakecolor": "white",
"landcolor": "#E5ECF6",
"showlakes": true,
"showland": true,
"subunitcolor": "white"
},
"hoverlabel": {
"align": "left"
},
"hovermode": "closest",
"mapbox": {
"style": "light"
},
"paper_bgcolor": "white",
"plot_bgcolor": "#E5ECF6",
"polar": {
"angularaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"radialaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"scene": {
"xaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"yaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"zaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
}
},
"shapedefaults": {
"line": {
"color": "#2a3f5f"
}
},
"ternary": {
"aaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"baxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"caxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"title": {
"x": 0.05
},
"xaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
},
"yaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
}
}
},
"width": 500,
"xaxis": {
"anchor": "y",
"domain": [
0,
0.3133333333333333
],
"title": {
"text": "epoch"
}
},
"xaxis2": {
"anchor": "y2",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"title": {
"text": "epoch"
}
},
"xaxis3": {
"anchor": "y3",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"title": {
"text": "epoch"
}
},
"xaxis4": {
"anchor": "y4",
"domain": [
0,
0.3133333333333333
],
"matches": "x",
"showticklabels": false
},
"xaxis5": {
"anchor": "y5",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"showticklabels": false
},
"xaxis6": {
"anchor": "y6",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"showticklabels": false
},
"xaxis7": {
"anchor": "y7",
"domain": [
0,
0.3133333333333333
],
"matches": "x",
"showticklabels": false
},
"xaxis8": {
"anchor": "y8",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"showticklabels": false
},
"xaxis9": {
"anchor": "y9",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"showticklabels": false
},
"yaxis": {
"anchor": "x",
"domain": [
0,
0.3133333333333333
],
"title": {
"text": "loss"
}
},
"yaxis2": {
"anchor": "x2",
"domain": [
0,
0.3133333333333333
],
"matches": "y",
"showticklabels": false
},
"yaxis3": {
"anchor": "x3",
"domain": [
0,
0.3133333333333333
],
"matches": "y",
"showticklabels": false
},
"yaxis4": {
"anchor": "x4",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"title": {
"text": "loss"
}
},
"yaxis5": {
"anchor": "x5",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"showticklabels": false
},
"yaxis6": {
"anchor": "x6",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"showticklabels": false
},
"yaxis7": {
"anchor": "x7",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"title": {
"text": "loss"
}
},
"yaxis8": {
"anchor": "x8",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"showticklabels": false
},
"yaxis9": {
"anchor": "x9",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"showticklabels": false
}
}
}
},
"metadata": {},
"output_type": "display_data"
},
{
"data": {
"application/vnd.plotly.v1+json": {
"config": {
"plotlyServerURL": "https://plot.ly"
},
"data": [
{
"customdata": [
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1966966104<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": true,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x7",
"y": [
0.875748502994012,
0.8802395209580839,
0.8937125748502994,
0.8982035928143712,
0.9041916167664671,
0.905688622754491,
0.9086826347305389,
0.9131736526946108,
0.9131736526946108,
0.9161676646706587,
0.9176646706586826,
0.9206586826347305,
0.9236526946107785,
0.9236526946107785,
0.9266467065868264,
0.9266467065868264,
0.9251497005988024,
0.9266467065868264,
0.9281437125748503,
0.9296407185628742,
0.9326347305389222,
0.9326347305389222,
0.9341317365269461,
0.9356287425149701,
0.9356287425149701,
0.9356287425149701,
0.937125748502994,
0.937125748502994,
0.938622754491018,
0.938622754491018
],
"yaxis": "y7"
},
{
"customdata": [
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1295327779<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x5",
"y": [
0.8802395209580839,
0.8877245508982036,
0.8952095808383234,
0.8982035928143712,
0.9011976047904192,
0.9041916167664671,
0.9101796407185628,
0.9131736526946108,
0.9146706586826348,
0.9161676646706587,
0.9191616766467066,
0.9191616766467066,
0.9191616766467066,
0.9206586826347305,
0.9221556886227545,
0.9251497005988024,
0.9281437125748503,
0.9296407185628742,
0.9311377245508982,
0.9311377245508982,
0.9326347305389222,
0.9341317365269461,
0.9356287425149701,
0.937125748502994,
0.938622754491018,
0.938622754491018,
0.938622754491018,
0.9401197604790419,
0.9416167664670658,
0.9431137724550899
],
"yaxis": "y5"
},
{
"customdata": [
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
]
],
"hovertemplate": "type=train<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=2104013193<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "train",
"line": {
"color": "#636efa",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "train",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x3",
"y": [
0.8712574850299402,
0.8862275449101796,
0.8937125748502994,
0.9026946107784432,
0.9041916167664671,
0.9101796407185628,
0.9146706586826348,
0.9161676646706587,
0.9191616766467066,
0.9221556886227545,
0.9236526946107785,
0.9251497005988024,
0.9266467065868264,
0.9281437125748503,
0.9311377245508982,
0.9326347305389222,
0.9341317365269461,
0.9356287425149701,
0.938622754491018,
0.9401197604790419,
0.9401197604790419,
0.9401197604790419,
0.9416167664670658,
0.9416167664670658,
0.9431137724550899,
0.9446107784431138,
0.9446107784431138,
0.9446107784431138,
0.9446107784431138,
0.9461077844311377
],
"yaxis": "y3"
},
{
"customdata": [
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
],
[
10,
10,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1966966104<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": true,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x7",
"y": [
0.8835820895522388,
0.8880597014925373,
0.8865671641791045,
0.8835820895522388,
0.8850746268656716,
0.8880597014925373,
0.891044776119403,
0.8925373134328358,
0.8955223880597015,
0.8955223880597015,
0.8985074626865671,
0.9,
0.9029850746268657,
0.9029850746268657,
0.9044776119402985,
0.9059701492537313,
0.908955223880597,
0.9134328358208955,
0.9149253731343283,
0.9164179104477612,
0.9164179104477612,
0.9164179104477612,
0.917910447761194,
0.917910447761194,
0.917910447761194,
0.9208955223880597,
0.9223880597014925,
0.9223880597014925,
0.9223880597014925,
0.9238805970149254
],
"yaxis": "y7"
},
{
"customdata": [
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
],
[
100,
100,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=1295327779<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x5",
"y": [
0.8865671641791045,
0.8865671641791045,
0.8925373134328358,
0.8880597014925373,
0.891044776119403,
0.8925373134328358,
0.8955223880597015,
0.9,
0.9029850746268657,
0.9044776119402985,
0.9059701492537313,
0.9074626865671642,
0.9074626865671642,
0.908955223880597,
0.9104477611940298,
0.9149253731343283,
0.9164179104477612,
0.917910447761194,
0.9194029850746268,
0.9194029850746268,
0.9208955223880597,
0.9238805970149254,
0.926865671641791,
0.926865671641791,
0.9298507462686567,
0.9298507462686567,
0.9313432835820895,
0.9298507462686567,
0.9298507462686567,
0.9298507462686567
],
"yaxis": "y5"
},
{
"customdata": [
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
],
[
1000,
1000,
0.2
]
],
"hovertemplate": "type=test<br>learning_rate=%{customdata[1]}<br>batch_size=%{customdata[0]}<br>run_id=2104013193<br>epoch=%{x}<br>accuracy=%{y}<br>dropout_fraction=%{customdata[2]}<extra></extra>",
"legendgroup": "test",
"line": {
"color": "#EF553B",
"dash": "solid"
},
"marker": {
"symbol": "circle"
},
"mode": "lines",
"name": "test",
"orientation": "v",
"showlegend": false,
"type": "scatter",
"x": [
1,
2,
3,
4,
5,
6,
7,
8,
9,
10,
11,
12,
13,
14,
15,
16,
17,
18,
19,
20,
21,
22,
23,
24,
25,
26,
27,
28,
29,
30
],
"xaxis": "x3",
"y": [
0.8805970149253731,
0.8820895522388059,
0.8895522388059701,
0.8925373134328358,
0.8940298507462686,
0.8985074626865671,
0.9014925373134328,
0.9044776119402985,
0.9074626865671642,
0.908955223880597,
0.9119402985074627,
0.9119402985074627,
0.9134328358208955,
0.9149253731343283,
0.917910447761194,
0.9164179104477612,
0.9194029850746268,
0.9208955223880597,
0.9208955223880597,
0.9223880597014925,
0.9223880597014925,
0.9253731343283582,
0.9253731343283582,
0.9253731343283582,
0.926865671641791,
0.926865671641791,
0.9283582089552239,
0.9298507462686567,
0.9298507462686567,
0.9313432835820895
],
"yaxis": "y3"
}
],
"layout": {
"annotations": [
{
"font": {},
"showarrow": false,
"text": "batch_size=10",
"x": 0.15666666666666665,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "batch_size=100",
"x": 0.49,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "batch_size=1000",
"x": 0.8233333333333333,
"xanchor": "center",
"xref": "paper",
"y": 0.9999999999999998,
"yanchor": "bottom",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=1000",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.15666666666666665,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=100",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.4999999999999999,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "learning_rate=10",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.8433333333333332,
"yanchor": "middle",
"yref": "paper"
}
],
"legend": {
"title": {
"text": "type"
},
"tracegroupgap": 0
},
"margin": {
"t": 60
},
"template": {
"data": {
"bar": [
{
"error_x": {
"color": "#2a3f5f"
},
"error_y": {
"color": "#2a3f5f"
},
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "bar"
}
],
"barpolar": [
{
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "barpolar"
}
],
"carpet": [
{
"aaxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"baxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"type": "carpet"
}
],
"choropleth": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "choropleth"
}
],
"contour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "contour"
}
],
"contourcarpet": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "contourcarpet"
}
],
"heatmap": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmap"
}
],
"heatmapgl": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmapgl"
}
],
"histogram": [
{
"marker": {
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "histogram"
}
],
"histogram2d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2d"
}
],
"histogram2dcontour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2dcontour"
}
],
"mesh3d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "mesh3d"
}
],
"parcoords": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "parcoords"
}
],
"pie": [
{
"automargin": true,
"type": "pie"
}
],
"scatter": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter"
}
],
"scatter3d": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter3d"
}
],
"scattercarpet": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattercarpet"
}
],
"scattergeo": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergeo"
}
],
"scattergl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergl"
}
],
"scattermapbox": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattermapbox"
}
],
"scatterpolar": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolar"
}
],
"scatterpolargl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolargl"
}
],
"scatterternary": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterternary"
}
],
"surface": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "surface"
}
],
"table": [
{
"cells": {
"fill": {
"color": "#EBF0F8"
},
"line": {
"color": "white"
}
},
"header": {
"fill": {
"color": "#C8D4E3"
},
"line": {
"color": "white"
}
},
"type": "table"
}
]
},
"layout": {
"annotationdefaults": {
"arrowcolor": "#2a3f5f",
"arrowhead": 0,
"arrowwidth": 1
},
"autotypenumbers": "strict",
"coloraxis": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"colorscale": {
"diverging": [
[
0,
"#8e0152"
],
[
0.1,
"#c51b7d"
],
[
0.2,
"#de77ae"
],
[
0.3,
"#f1b6da"
],
[
0.4,
"#fde0ef"
],
[
0.5,
"#f7f7f7"
],
[
0.6,
"#e6f5d0"
],
[
0.7,
"#b8e186"
],
[
0.8,
"#7fbc41"
],
[
0.9,
"#4d9221"
],
[
1,
"#276419"
]
],
"sequential": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"sequentialminus": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
]
},
"colorway": [
"#636efa",
"#EF553B",
"#00cc96",
"#ab63fa",
"#FFA15A",
"#19d3f3",
"#FF6692",
"#B6E880",
"#FF97FF",
"#FECB52"
],
"font": {
"color": "#2a3f5f"
},
"geo": {
"bgcolor": "white",
"lakecolor": "white",
"landcolor": "#E5ECF6",
"showlakes": true,
"showland": true,
"subunitcolor": "white"
},
"hoverlabel": {
"align": "left"
},
"hovermode": "closest",
"mapbox": {
"style": "light"
},
"paper_bgcolor": "white",
"plot_bgcolor": "#E5ECF6",
"polar": {
"angularaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"radialaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"scene": {
"xaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"yaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"zaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
}
},
"shapedefaults": {
"line": {
"color": "#2a3f5f"
}
},
"ternary": {
"aaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"baxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"caxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"title": {
"x": 0.05
},
"xaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
},
"yaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
}
}
},
"width": 500,
"xaxis": {
"anchor": "y",
"domain": [
0,
0.3133333333333333
],
"title": {
"text": "epoch"
}
},
"xaxis2": {
"anchor": "y2",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"title": {
"text": "epoch"
}
},
"xaxis3": {
"anchor": "y3",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"title": {
"text": "epoch"
}
},
"xaxis4": {
"anchor": "y4",
"domain": [
0,
0.3133333333333333
],
"matches": "x",
"showticklabels": false
},
"xaxis5": {
"anchor": "y5",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"showticklabels": false
},
"xaxis6": {
"anchor": "y6",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"showticklabels": false
},
"xaxis7": {
"anchor": "y7",
"domain": [
0,
0.3133333333333333
],
"matches": "x",
"showticklabels": false
},
"xaxis8": {
"anchor": "y8",
"domain": [
0.3333333333333333,
0.6466666666666666
],
"matches": "x",
"showticklabels": false
},
"xaxis9": {
"anchor": "y9",
"domain": [
0.6666666666666666,
0.98
],
"matches": "x",
"showticklabels": false
},
"yaxis": {
"anchor": "x",
"domain": [
0,
0.3133333333333333
],
"title": {
"text": "accuracy"
}
},
"yaxis2": {
"anchor": "x2",
"domain": [
0,
0.3133333333333333
],
"matches": "y",
"showticklabels": false
},
"yaxis3": {
"anchor": "x3",
"domain": [
0,
0.3133333333333333
],
"matches": "y",
"showticklabels": false
},
"yaxis4": {
"anchor": "x4",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"title": {
"text": "accuracy"
}
},
"yaxis5": {
"anchor": "x5",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"showticklabels": false
},
"yaxis6": {
"anchor": "x6",
"domain": [
0.34333333333333327,
0.6566666666666665
],
"matches": "y",
"showticklabels": false
},
"yaxis7": {
"anchor": "x7",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"title": {
"text": "accuracy"
}
},
"yaxis8": {
"anchor": "x8",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"showticklabels": false
},
"yaxis9": {
"anchor": "x9",
"domain": [
0.6866666666666665,
0.9999999999999998
],
"matches": "y",
"showticklabels": false
}
}
}
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"runs_df = pd.concat(results)\n",
"\n",
"# plot training loss and test loss over time\n",
"px.line(\n",
" runs_df,\n",
" line_group=\"run_id\",\n",
" x=\"epoch\",\n",
" y=\"loss\",\n",
" color=\"type\",\n",
" hover_data=[\"batch_size\", \"learning_rate\", \"dropout_fraction\"],\n",
" facet_row=\"learning_rate\",\n",
" facet_col=\"batch_size\",\n",
" width=500,\n",
").show()\n",
"\n",
"# plot accuracy over time\n",
"px.line(\n",
" runs_df,\n",
" line_group=\"run_id\",\n",
" x=\"epoch\",\n",
" y=\"accuracy\",\n",
" color=\"type\",\n",
" hover_data=[\"batch_size\", \"learning_rate\", \"dropout_fraction\"],\n",
" facet_row=\"learning_rate\",\n",
" facet_col=\"batch_size\",\n",
" width=500,\n",
").show()\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "MiBQDcMPpgkS"
},
"source": [
"## 8. Plot the before & after, showing the results of the best matrix found during training\n",
"\n",
"The better the matrix is, the more cleanly it will separate the similar and dissimilar pairs."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"id": "hzjoyLDOpgkS"
},
"outputs": [],
"source": [
"# apply result of best run to original data\n",
"best_run = runs_df.sort_values(by=\"accuracy\", ascending=False).iloc[0]\n",
"best_matrix = best_run[\"matrix\"]\n",
"apply_matrix_to_embeddings_dataframe(best_matrix, df)\n"
]
},
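  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an optional follow-up, here is a minimal sketch of how you might apply the optimized matrix to brand-new text at inference time. It assumes `best_matrix` is (or has been converted to) a NumPy array and reuses `get_embedding`, `cosine_similarity`, and `default_embedding_engine` from the earlier cells; the helper name and the example sentences are illustrative, not part of the training procedure above."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# minimal sketch: customize new embeddings with the optimized matrix\n",
    "# assumes best_matrix is a NumPy array; if it is a torch tensor, convert it\n",
    "# first, e.g. best_matrix = best_matrix.detach().numpy()\n",
    "\n",
    "\n",
    "def custom_embedding(text: str, matrix: np.ndarray) -> np.ndarray:\n",
    "    # embed the text with the same engine used during training, then multiply\n",
    "    raw_embedding = np.array(get_embedding(text, engine=default_embedding_engine))\n",
    "    return raw_embedding @ matrix\n",
    "\n",
    "\n",
    "# example usage (illustrative sentences):\n",
    "# e_1 = custom_embedding(\"A man is playing a guitar.\", best_matrix)\n",
    "# e_2 = custom_embedding(\"Someone is making music.\", best_matrix)\n",
    "# cosine_similarity(e_1, e_2)\n"
   ]
  },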
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"id": "nLnvABnXpgkS",
"outputId": "0c070faa-6e3e-4765-b082-565c72a609be"
},
"outputs": [
{
"data": {
"application/vnd.plotly.v1+json": {
"config": {
"plotlyServerURL": "https://plot.ly"
},
"data": [
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=train<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
0.9267355922090726,
0.8959824210865295,
0.911972591898887,
0.854066984904447,
0.892887342538514,
0.9197115504102283,
0.86454296137645,
0.8314164148734599,
0.724331390174805,
0.8819971496348794,
0.7956215054013406,
0.7959481828066851,
0.8682525486487739,
0.8973704559214578,
0.8648042598103035,
0.9236698983952911,
0.9834804743408886,
0.8152447417624246,
0.82517200338841,
0.8138195591908199,
0.804188586062905,
0.9329690881323882,
0.9560346902836805,
0.9727875564710335,
0.8739787475357144,
0.8208200931608043,
0.7246913155327134,
0.9324916311845146,
0.8285737168086551,
0.8797008553699697,
0.8203332150276859,
0.9370111006544561,
0.8983827482700403,
0.8312111261703522,
0.8164052526562986,
0.89081486465724,
0.7466264016350165,
0.7496519642865328,
0.8737558267185661,
0.7849398152833806,
0.8309506411877995,
0.930721217634791,
0.8281747402318884,
0.9529528463964135,
0.78286620810114,
0.8871009561039284,
0.9000278355775503,
0.8805448754876422,
0.9303377269239715,
0.880195490304124,
0.8529206894100387,
0.9467797365089127,
0.9503676908767298,
0.7030531845036039,
0.8643992383828719,
0.8536886653620115,
0.9619331110018076,
0.9798279216368141,
0.8545739734233097,
0.8957115038209394,
0.8241137164789778,
0.8234984829866299,
0.8936706242503488,
0.8987178415114151,
0.9081806523258728,
0.9208852069309506,
0.8961858080568302,
0.8831329492644463,
0.9282623086728464,
0.8990849222879878,
0.8284548404976377,
0.8202091320216596,
0.8647762708043815,
0.8401369579324562,
0.9887387560741359,
0.8333426560846096,
0.8285331108196707,
0.9118814662694842,
0.8706628935716073,
0.9279786047447278,
0.7389559884393851,
0.8433932042319168,
0.9240307531537069,
0.9507699373739879,
0.8586024439929903,
0.8685107123188051,
0.8755350362634888,
0.9894909158977805,
0.8279650076349706,
0.9108703736855251,
0.9090161902538356,
0.8603952587890591,
0.7791958177087142,
0.8800175081701747,
0.8442387838852023,
0.7672266109003523,
0.9379753909220483,
0.8637536217766965,
0.9190295692184896,
0.8137487445889441,
0.913488637289173,
0.8043760077432088,
0.879230049130692,
0.8716796299186113,
0.8669146720822511,
0.7736224662910375,
0.9439048564746342,
0.905686329549249,
0.9534823417127044,
0.9150626364280348,
0.9409873575925382,
0.8111212514948384,
0.9171209894364517,
0.9126582215678652,
0.8337042978731589,
0.7317859049265332,
0.8444929456896246,
0.8561920423978694,
0.7765276737312753,
0.8526116780064548,
0.9178549175037264,
0.9238337663325366,
0.7218806787029511,
0.8180162425905607,
0.9687438996846139,
0.8354559170776137,
0.9146160669265362,
0.808210346566899,
0.9563444959106976,
0.9066029888020705,
0.8485102489128452,
0.8154210964137394,
0.8862154929899421,
0.9280705424664027,
0.9835182438283631,
0.9653797794869796,
0.7815047664005954,
0.7156150759652161,
0.9256075945052357,
0.8135073899611842,
0.9655015183317774,
0.8222681606077051,
0.9072121875352692,
0.8611990749314697,
0.9075083706276883,
0.9452697088507865,
0.8792221642490844,
0.9261547231888615,
0.8628843091591882,
0.7825774678762871,
0.8265878682590281,
0.739969794812712,
0.7855475190052562,
0.9111614048749492,
0.8871057099019782,
0.8824403180046967,
0.8618250241358769,
0.9787037899591454,
0.8066230593744526,
0.8276910929485197,
0.9246432065475546,
0.8840853147036714,
0.7864843506607968,
0.9106863309457174,
0.9342800777751461,
0.8573335076933394,
0.7780871068368611,
0.7913314671023687,
0.8574377397654754,
0.9078366094992862,
0.752973927739965,
0.8630106574340903,
0.9051526765387132,
0.7715467460924421,
0.8941465881092564,
0.8095881925341774,
0.7733578297403775,
0.7600408383615723,
0.7819972023010567,
0.9003461723046663,
0.742804802462531,
0.8645936494952892,
0.8158769876746998,
0.8338827591801034,
0.8272653957842918,
0.9017517383025067,
0.8480852031381642,
0.7970818327030217,
0.8483706700151505,
0.9272909957177218,
0.9511439768573109,
0.8796630928594475,
0.8297595345126891,
0.8132311692835352,
0.8460965104145681,
0.8787645382723887,
0.8591367321478075,
0.8452813271088438,
0.7081208529169517,
0.8769677227983257,
0.9576216492651992,
0.7463356296909661,
0.8618039394725079,
0.9560112448844987,
0.8478374741588728,
0.769289016610608,
0.8458585917175788,
0.9014601942019844,
0.8816990618751593,
0.8836365020988086,
0.8078009752591794,
0.8984716696273352,
0.9064470720437559,
0.8762712604989469,
0.9178852324400089,
0.7896235961898858,
0.8939345730555539,
0.9534018416101309,
0.8358942065066962,
0.948865711109057,
0.9046799884368947,
0.7583576539746958,
0.9080459944470666,
0.7709722699637687,
0.963551247793185,
0.9792712669973792,
0.8526700752964347,
0.827813310501214,
0.9735858612930184,
0.7212301964264753,
0.8257425306850711,
0.924320548123444,
0.9183796450934556,
0.9029146930594939,
0.9410246041287362,
0.9609604037240548,
0.7467407977088399,
0.8831901227140917,
0.8173287201360423,
0.8067347035873811,
0.7921957440752069,
0.9110994798640996,
0.8678737504816454,
0.91177432256281,
0.7812564975232954,
0.8553931177741548,
0.8798565771781157,
0.8485358177151634,
0.7748765500469469,
0.9432062978626803,
0.8328320715664294,
0.798362976362054,
0.9345589971516312,
0.7800346997026738,
0.9894680324717378,
0.8239308908293631,
0.8236003487600682,
0.8346101071823683,
0.8273793498951607,
0.7872103659197973,
0.9502897886350955,
0.8330663046259037,
0.934656824021464,
0.8082083574312163,
0.8920672691284423,
0.8566523142422968,
0.7636170839305908,
0.8271048233812095,
0.8450776680779332,
0.9045266242453643,
0.8578964048993004,
0.8673866120865574,
0.8804224183254911,
0.8199459541564516,
0.9324100333449752,
0.9096821257786284,
0.8658255623901577,
0.9386720382389069,
0.8517830108211426,
0.8894337360140997,
0.9788475938303791,
0.8369738176471242,
0.8438616298356066,
0.9457050131096572,
0.8699723457920832,
0.7795221422725261,
0.9136284838226408,
0.8394610380643428,
0.9453279812604809,
0.7899532079576974,
0.9078373592832483,
0.8434980565725266,
0.8112068695892253,
0.9466506417952321,
0.931413666521914,
0.7932453739077451,
0.8205411410996694,
0.9243834389749737,
0.7196162090749076,
0.7552985097607482,
0.9593440980269001,
0.9175579371411101,
0.8643861904380715,
0.8315201131392358,
0.7608819740667967,
0.9704324556248521,
0.8037085296495649,
0.7785353984256803,
0.8044961880185003,
0.8313307508528462,
0.8064106355318161,
0.9291149178587121,
0.8412940943665776,
0.6917091092254815,
0.8952044326369335,
0.818225072265956,
0.8645847235619342,
0.8532020278604288,
0.8143634599177915,
0.8829012215420231,
0.7764652540281851,
0.8500993692007114,
0.8616919094128496,
0.9257293988684876,
0.935772204981356,
0.774265719975256,
0.789871006952492,
0.8590438495949997,
0.9317809675958327,
0.9087109945316316,
0.9492979985891563,
0.8813316522495983,
0.737208140494784,
0.8838176414418067
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=test<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
0.9424796846788046,
0.9078956616062651,
0.8334324869405139,
0.9352180100721489,
0.9055462990278683,
0.8981939713362292,
0.8310153265298836,
0.8504676065056102,
0.8456281890127811,
0.8845204605513738,
0.9575409744952922,
0.8867362111321382,
0.8268148049027775,
0.9197424492086052,
0.7868932882211557,
0.7584994078201337,
0.9184151112777117,
0.8634069824306613,
0.8347803692078435,
0.8293627321978324,
0.9290633376090963,
0.8385821685601387,
0.9389267225654604,
0.8908184420511278,
0.8663476047908254,
0.8406483287589527,
0.8084243400296846,
0.8909500802168062,
0.9262896014538773,
0.8955541227032415,
0.8055268516127605,
0.7586268193375352,
0.9609058493434491,
0.9149590584369259,
0.8670137150023248,
0.8813831596952219,
0.860225515397,
0.9239960993694921,
0.9173221779567197,
0.8037285375166193,
0.9196033586084531,
0.8179495005725935,
0.9015423000521007,
0.9054394611244669,
0.9309412938014421,
0.9421722896767072,
0.7632823193304991,
0.8622055681944147,
0.9855273112832761,
0.914415556703985,
0.9160573926361296,
0.8027504541651594,
0.7131090046615766,
0.8617419486109846,
0.98287317120162,
0.8100227524488052,
0.892387860418092,
0.809664342563128,
0.8707613725090536,
0.8786740135792194,
0.827463989695164,
0.8927098766437765,
0.9565597072685753,
0.9060728094488207,
0.7383075176406174,
0.9645943656943117,
0.8755564011198428,
0.879644342835206,
0.8679709662655806,
0.9304235140233539,
0.8902804954107686,
0.874836956726809,
1,
0.7979398160311217,
0.8182553476855959,
0.7782108664889419,
0.8427610541278799,
0.8696408841463731,
0.8747903021226509,
0.9149733683476413,
0.9651568967676807,
0.977554798666313,
0.8964005890099545,
0.8689760342800351,
0.8501707280841363,
0.9069421093108844,
0.7682621581806748,
0.9658683145893564,
0.8946443490839046,
0.7855154288057422,
0.8963791538152951,
0.8062904923128396,
0.8165205974456892,
0.8392522239745428,
0.9456080865553905,
0.7904904118155052,
0.8331267917887729,
0.7852156048607353,
0.7859162372602091,
0.90976749903987,
0.8868692158735381,
0.9391826646888828,
0.9428151203411792,
0.7923603881082193,
0.9018727187087263,
0.97231619441654,
0.7820369106687125,
0.9667234198836612,
0.9787696268534193,
0.9155729796430734,
0.8273013981821028,
0.960331962375041,
0.929897501248699,
0.8775117472056205,
0.8613342799390303,
0.9144155658413454,
0.7783710778245275,
0.9701880187837707,
0.7858944695167878,
0.9278353488412265,
0.9472367442821338,
0.7834809783164823,
0.7997358970000906,
0.8459052928211823,
0.8612077001477506,
0.8470901718545574,
0.8240372721865142,
0.8656086505509303,
0.8023193245375629,
0.783678884717712,
0.8804041342871782,
0.8491559248265502,
0.7883452708992278,
0.9461393747874567,
0.8351233852567399,
0.8158174033362672,
0.8604581312681885,
0.9623616564166072,
0.856468839580938,
0.8576867667576002,
0.8973905359734362,
0.8634447086393151,
0.8149528594157183,
0.8731712539786042,
0.8653347693348777,
0.9295255577503568,
0.8358267202312724,
0.9718886825986638,
0.8500189244661982,
0.6201715853032974,
0.8982737441192186,
0.8919523976747616,
0.7327218610615461,
0.8329671226232828,
0.9265589852995393,
0.8976605728389208,
0.8865148834725959,
0.7893917266176482,
0.7303107669745307,
0.8428958494374836,
0.8712646527997077,
0.9726111204993027,
0.9368020235357589,
0.9270010845221283,
0.8900608737222808,
0.79751731467271,
0.940330874442756,
0.8484005154341017,
0.9285585486502653,
0.8461714648336822,
0.9301612560985565,
0.9840391345414705,
0.8305503022437543,
0.8985536904301074,
0.9477072571711664,
0.934289266722412,
0.8849523260221185,
0.773662084263725,
0.8083290895710892,
0.9510007702344464,
0.8677438099387293,
0.8324233959729913,
0.7379868665757632,
0.9049462203262157,
0.9044068971508709,
0.7810399091823383,
0.9041769944901107,
0.7720832575605646,
0.7168259247291856,
0.8657076247663684,
0.9689982289113886,
0.9330371342125484,
0.7014093148352947,
0.9056081834465988,
0.8483474406338491,
0.8729108893579319,
0.8494252832990817,
0.8702668024360607,
0.8703072657352607,
0.9279473627134431,
0.8615930019969985,
0.7590822858582416,
0.8435232133017242,
0.8264379729550373,
0.8793126203874563,
0.8474523011181411,
0.7546334362798065,
0.8870818558635253,
0.8349553719953364,
0.923200758907938,
0.7924421886376952,
0.855610314876051,
0.8397958722387048,
0.9358165871780313,
0.9045773532651927,
0.9022537126477369,
0.7756039171534931,
0.9460916193165211,
0.8264119474819362,
0.8261258110555288,
0.8605336601635148,
0.7518422502719879,
0.8495875568327971,
0.992279957461567,
0.7499254098383082,
0.8845204605513738,
0.8361936554147797,
0.9172228811270781,
0.8068135569680097,
0.7957399297673027,
0.8632611459497657,
0.7612462572836113,
0.958912542207282,
0.9555759038520236,
0.8822980111141415,
0.9663740138580926,
0.9071760951682218,
0.933533889331542,
0.8042262160076494,
0.9399607299036465,
0.8318513717574904,
0.8697471261915183,
0.9103391823944785,
0.8272582058280911,
0.7868989551985196,
0.741616891032038,
0.8828593526738941,
0.9141342991713857,
0.7259887482535182,
0.9478299712074272,
0.8437665184157634,
0.9198304263214642,
0.9069062939546915,
0.9036466179892355,
0.9817542892477462,
0.8833292620163823,
0.8325566159927532,
0.8135910430676571,
0.9628932976448151,
0.9450804651757593,
0.9226384097207587,
0.8401818092769459,
0.7236914068799891,
0.6828741129809796,
0.8344105231696747,
0.9959256404068638,
0.9528703966342777,
0.9695146929637602,
0.9220387803870667,
0.9511950111612875,
0.8744220297098892,
0.8399026052955197,
0.9029483760093544,
0.9097073428234548,
0.8651925589034414,
0.9178332688200683,
0.7556713750040486,
0.8601740878617401,
0.8250804248322693,
0.7994733073162199,
0.8911389632926229,
0.9159137771752827,
0.7867422038306616,
0.8035375125861887,
0.7702882646822419,
0.9060460436592801,
0.7214029227404364,
0.8607904816523709,
0.8228468627026362,
0.8900020170242702,
0.9343567733995704,
0.9305049273825277,
0.9664193138851035,
0.9008537856737299,
0.7625840736444573,
0.8153020546259354,
0.9215061720116507,
0.7192673780176765,
0.8949994062319516,
0.936756654753208,
0.7602684168515255,
0.8184439768344212,
0.8361983865246644,
0.7761725471031594,
0.7724780963721255,
0.9249211342441499,
0.8718843142394451,
0.8522890338443532,
0.9015475856777736,
0.8720699804712655,
0.8937599375974886,
0.8721713576430158,
0.8100783165392635,
1,
0.8213222547688209,
0.8361185411078411,
0.8371907462164929,
0.9065697379059939,
0.7522406715066838,
0.828307889290731,
0.8499886821303806,
0.9097932363997518,
0.9529813102433097,
0.8449289750216329,
1,
0.8302949354181002,
0.7741532048489975,
0.8743828041850981,
0.8201855611976102,
0.8194689754558628,
0.792507679596051,
0.8748126109754423,
0.8299510305152616,
0.9619426556959261,
0.8627070028560689
],
"xaxis": "x",
"yaxis": "y"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=train<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
0.6957767388290562,
0.7579420785485997,
0.6956346694277136,
0.7076445232623223,
0.7807306734457936,
0.7139655931983309,
0.7482502662331184,
0.6369069500843659,
0.6658508493766556,
0.6504235710140688,
0.7192157983049555,
0.8018422916166743,
0.7177609439190309,
0.7101045697103688,
0.6571780241576142,
0.7680272436762176,
0.7234850964131593,
0.7152898708694893,
0.9501355018154564,
0.6962127827942286,
0.7684203207980727,
0.6855948971499829,
0.765605006674216,
0.7443232402476876,
0.7041761758241216,
0.8326497880219594,
0.7441778029675181,
0.7008454983313879,
0.7537693597675987,
0.7604977355277929,
0.6549109506960609,
0.7436588048190768,
0.736629241812364,
0.7186606117959837,
0.743294708716713,
0.7984912579429615,
0.8887305476049442,
0.6889141161032268,
0.7456194127120711,
0.6994897408446203,
0.7583012514203751,
0.7085664340222914,
0.6923652600780655,
0.7408748794305987,
0.711424171061503,
0.6311728944545804,
0.6777055537053459,
0.7255800325087469,
0.648567491821295,
0.6743371742523807,
0.8018549612389473,
0.7894137282256273,
0.7128392177932913,
0.7188183914165817,
0.7704057820977789,
0.7196836946749967,
0.7360703150646857,
0.6815385098793996,
0.6487310877592406,
0.697329063568009,
0.6597277478579927,
0.8184293996585623,
0.7593483031078324,
0.6532305277723722,
0.7114445030553102,
0.7014456446951195,
0.7153770310598158,
0.7888810617640961,
0.8567450102581987,
0.7352496863055631,
0.7409637400944985,
0.7436246856280054,
0.6776765762765335,
0.7574849242619103,
0.7781943930539938,
0.6705833254898479,
0.6996804955392666,
0.708413777144502,
0.6208481877780442,
0.7631377712831763,
0.7236950904524616,
0.6930659760723458,
0.8197788002483368,
0.7183749308247108,
0.708279940474656,
0.6716302691422836,
0.7227227641784396,
0.7176967428668025,
0.6344405133500717,
0.7347697535407555,
0.6137191056434405,
0.704325807968194,
0.682901324910752,
0.740582042072576,
0.8504604982454053,
0.6576669009983956,
0.7377992639136004,
0.6768728894788933,
0.7390305317492698,
0.6779588884193068,
0.7010958025289625,
0.6808620853852188,
0.7342685181593549,
0.7247058033450888,
0.6666900634685694,
0.7249360881820055,
0.6821873098905429,
0.8266266825008808,
0.7802529245761647,
0.74664178598042,
0.7353986890974347,
0.7470097879184785,
0.6901197315623567,
0.7382066089989482,
0.6589753496564349,
0.8008982879808738,
0.7168379312540042,
0.7521414198277515,
0.6953320433269816,
0.8073035301459459,
0.8027947050396425,
0.7043228507137288,
0.7231539855105249,
0.7383263907454282,
0.7576679699471423,
0.663538429030986,
0.6595808033434192,
0.7882393615703649,
0.7930397325325561,
0.7350673331350056,
0.7353002557889453,
0.5974027090802823,
0.71498635419546,
0.7622099782316335,
0.7391795687415161,
0.6675464131038639,
0.7154656258748834,
0.7437529076201842,
0.6211596349938211,
0.7188358964678377,
0.849937090645506,
0.7095529861328113,
0.7406039820475517,
0.8252001843610418,
0.6600510705360498,
0.8210357263546099,
0.6929119099897709,
0.7213879617650821,
0.75916409837913,
0.7427633178667192,
0.7552302281942865,
0.7063101613043915,
0.750297544678819,
0.7543911574920913,
0.6966625389622579,
0.7144451621430684,
0.6869972931308058,
0.677188890003343,
0.9156123844751002,
0.6553838580181919,
0.7216969692662536,
0.7057076998477817,
0.7288073281835964,
0.6725923362897323,
0.7143254559469274,
0.7518074207615209,
0.8335512685134062,
0.7759626794704192,
0.8856844683478049,
0.7041490384164095,
0.7165191930620582,
0.7135550060411538,
0.7912155955788918,
0.7551034045584042,
0.6986803166472683,
0.754824217875114,
0.7302660247050152,
0.7292630043525107,
0.571656914618242,
0.6698699469374744,
0.7984652233767994,
0.7727344176188279,
0.8009444873466395,
0.7941470756172919,
0.7652444284493098,
0.6741336809466748,
0.7539180675581941,
0.8697425458622727,
0.6918630406009729,
0.7489767304058851,
0.736384242889192,
0.7813447997158885,
0.7171120058842186,
0.7750322398349039,
0.8005281011285704,
0.7211245778376837,
0.7673632996245495,
0.7783697594140624,
0.7266884553674846,
0.6756399302317858,
0.6556438756089052,
0.7128589961833791,
0.7581993529758215,
0.6609844666486506,
0.7097778453330825,
0.7400669997825543,
0.8194161734102894,
0.6364360529422222,
0.7037773962740846,
0.8696228178354303,
0.7345582917241741,
0.643112997052813,
0.7610320975809615,
0.753687475759657,
0.7468750326384852,
0.7035578981973345,
0.7310425504599337,
0.7897991836049737,
0.7245144286237929,
0.6316741098779987,
0.7012328824900284,
0.5900419819184002,
0.8182166949314503,
0.7228756964171011,
0.6440790377007117,
0.7668502479739949,
0.7994625786967952,
0.6916093734832223,
0.654390476398534,
0.7686898144257487,
0.6858388736601835,
0.7414441464634663,
0.7167581644002714,
0.7534962658828813,
0.7785786109741161,
0.7709302585561065,
0.8217756641797949,
0.7467392332137058,
0.81777684562388,
0.7473614292639223,
0.7611878839275079,
0.7117761650105634,
0.7085496761694915,
0.7124153888060497,
0.7010151710136044,
0.7869222008593535,
0.711698947348409,
0.7471849590974714,
0.7444455266265991,
0.7109919785851376,
0.6761384312121954,
0.7069524601995654,
0.6769302705298076,
0.6600007718102588,
0.7134572349000202,
0.6776847453414893,
0.7701252174850034,
0.7619024930673666,
0.7350124211266063,
0.7282874850408689,
0.7150121553663806,
0.7686694919390304,
0.710804915025901,
0.754091001033791,
0.659766025750913,
0.7343426894879505,
0.6882973972778424,
0.7017239067139008,
0.6961212586520296,
0.68280329952847,
0.7474987582590388,
0.7024630975520628,
0.6982331978086047,
0.6592616699060925,
0.6607786597175946,
0.6330185581995625,
0.6492017860797994,
0.7432473772469652,
0.6808312235563921,
0.7015910481109136,
0.7295204842133106,
0.727575624013537,
0.6290966993880249,
0.8097507802881175,
0.6950500355035831,
0.7610194730389668,
0.7653203010902764,
0.7930178316838892,
0.734057647575713,
0.6882673397419932,
0.7068002726242354,
0.793009415598764,
0.643454861462673,
0.676368701856204,
0.7430550382270783,
0.7719090768075099,
0.6580654026532401,
0.7428410268158974,
0.7592748841815049,
0.7695088759512342,
0.702181066664115,
0.7711052428118305,
0.7434207560001076,
0.7404961692611163,
0.7318114780227037,
0.7461892166750458,
0.7733820478206747,
0.6616145428467011,
0.7695717007796302,
0.7980215897653987,
0.747678639463431,
0.7509209450577673,
0.6585229935121563,
0.737340279891889,
0.7299769809214144,
0.6643481197819185,
0.7141036136948552,
0.7186136947767433,
0.7683181698119435,
0.7266528742664866,
0.7207220525925799,
0.7262594771454401,
0.699484819588429,
0.7087768808001038,
0.7031346970669167,
0.7759793525469474,
0.6645736829343425,
0.7238090745888058,
0.756500387168653,
0.7212253755892469,
0.9439048564746342,
0.7170936399576909,
0.6684835207627098,
0.7427154430013124
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=test<br>cosine_similarity=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
0.6512761063171582,
0.7287342850883989,
0.7577025876749072,
0.7592639273035047,
0.6818876474384771,
0.7152573447252137,
0.774350664396305,
0.6550472488733596,
0.7738981884615284,
0.7541243234162924,
0.7519536763883761,
0.8320210261974733,
0.7426518936353101,
0.7265979080155922,
0.890966113872019,
0.7353818553266447,
0.7634921579881796,
0.8294904864250581,
0.8137172697009756,
0.8420971446336485,
0.6893957895813114,
0.7413429532184015,
0.7582374198607541,
0.7828280012313105,
0.8391627159272224,
0.7217988885724745,
0.7162480084188481,
0.6981259451671704,
0.6588040830433353,
0.7549114753010229,
0.7674729370748693,
0.7677466078268333,
0.7813055362210076,
0.825251692431862,
0.6297959602274461,
0.7641061587072203,
0.7753469951347335,
0.674860172370046,
0.7938148378683839,
0.8198431171016068,
0.7595090110974859,
0.7507352656520713,
0.721289644501111,
0.741448088675425,
0.7510460463679841,
0.6800620536804655,
0.7915508451351532,
0.7535140620444722,
0.697352781045208,
0.6921955653816687,
0.7381201713883649,
0.7590104632379403,
0.7947524562852025,
0.7597505011395934,
0.7777522657386521,
0.7939587696382155,
0.7070668790582435,
0.840089831842377,
0.6880689281216312,
0.7677769805308678,
0.7233632378806841,
0.7961527561562991,
0.6980444937348311,
0.75390648995092,
0.6749693686864726,
0.7568808238483822,
0.7172913226413175,
0.7473527225327126,
0.8457406370143061,
0.8437228274714977,
0.6958621694379827,
0.7765377372889585,
0.6973343891702213,
0.7756899803088712,
0.6635907810675821,
0.6488560596672128,
0.7298030259010234,
0.6846443367035591,
0.7001117775348583,
0.6966337253874092,
0.7232555055076415,
0.757474542932862,
0.6721150769416768,
0.7386670808176127,
0.6698897461312842,
0.7604099701068538,
0.6751748245607633,
0.6911357942165284,
0.7475988151834796,
0.7250925137024565,
0.7682822086780725,
0.6001504764928414,
0.7286482604645207,
0.7850979345046225,
0.7489292588984857,
0.7863999187035979,
0.7441989398358548,
0.6878769037784354,
0.7087455880321216,
0.7244528847914316,
0.7574691637677696,
0.776178673818935,
0.6687352373386877,
0.6726942377442499,
0.8771278312509985,
0.7276308021233041,
0.7367105247576413,
0.724957581832504,
0.734476946987388,
0.7037027676285385,
0.7196719298703599,
0.701205421305578,
0.7456378459685954,
0.7447862922562046,
0.7559499820995036,
0.7413402255446677,
0.7794929969859868,
0.6717172956350257,
0.7260762653292592,
0.753871022836222,
0.6352735205755001,
0.702039186060966,
0.7170572347148828,
0.7691696980107645,
0.728628644727037,
0.7968335420664091,
0.725401078187473,
0.7558589479031697,
0.6981586858386896,
0.7279514633434744,
0.8014665326776104,
0.7785624955253924,
0.8220831227737969,
0.6529629264385312,
0.8110829165536457,
0.7278873922826562,
0.7178469882442876,
0.7530896449767179,
0.772948903859944,
0.7575907169850528,
0.7082766658229158,
0.7226745075155961,
0.7018628141104125,
0.7235102040468365,
0.7564796924870556,
0.6510823792096195,
0.7744862914197582,
0.7086084098502505,
0.7296300991380639,
0.7173922813079803,
0.7516377265688725,
0.7508222794968331,
0.7338108329171547,
0.7786666590377729,
0.7828127092439076,
0.7114189502591047,
0.6482639628070281,
0.6457180925320146,
0.7522849702874722,
0.7676087704573854,
0.7021592330299424,
0.7501184007139824,
0.7127376229055441,
0.6135183363691984,
0.7353889483569046,
0.7512335236262094,
0.6589844377556087,
0.8108130845116872,
0.7484309312232497,
0.7157749767777302,
0.6755528085360091,
0.7067158772016874,
0.7682838787038063,
0.694274162009842,
0.7332763277935249,
0.7339484249124498,
0.7540065545888932,
0.7051222601331073,
0.6690000295656567,
0.6574876329908412,
0.763910578747556,
0.7439229694926585,
0.7327668331541545,
0.7218746066334083,
0.7173197798553519,
0.6233098479185598,
0.7265117625117632,
0.7717410161749992,
0.6854051180600711,
0.7647126507746091,
0.6792154479509659,
0.7081674789700858,
0.8082148565794375,
0.6911247759297874,
0.6437947506940874,
0.6959960643233447,
0.6751843420109633,
0.8249849915597032,
0.6037653436571525,
0.7652836594437318,
0.7977485667462958,
0.7534843359114771,
0.7199205406491861,
0.7865275101159649,
0.6417750850147381,
0.7199948511420241,
0.7352368906770536,
0.7534040762347791,
0.7381242671779171,
0.6815418931625805,
0.7633178120388293,
0.821504995270443,
0.7527585939232398,
0.7282115955272953,
0.6643874541733705,
0.7318894661339506,
0.6659948953312396,
0.8150879834179252,
0.6910386690814135,
0.6301591309232464,
0.7939423838462603,
0.628872152793837,
0.7657494007517387,
0.7343479723046831,
0.694662170867962,
0.7348332712061992,
0.7106706446249165,
0.7058771007273967,
0.7747826709165979,
0.7392215059198428,
0.67774434282123,
0.7831071588877654,
0.7554431084313338,
0.7040343241736384,
0.7001366857748546,
0.6847753231240155,
0.6959848301196746,
0.7813554656619699,
0.7150246292072198,
0.6633387291516334,
0.715043596855753,
0.6950135117043482,
0.7069528542592999,
0.6784499436539065,
0.7688234340496489,
0.7241563565971186,
0.7166312309451826,
0.7420523032504507,
0.706267239202893,
0.6442057457506499,
0.6182945197725017,
0.738526489411579,
0.7435080221752528,
0.6599631270531486,
0.669158474503511,
0.7084782747637822,
0.7682136631600044,
0.692764183789182,
0.8343481414223372,
0.6465626343994355,
0.7519945566830063,
0.6901944784591544,
0.6822869701777718,
0.735620343723582,
0.7434270528066679,
0.7530717871104928,
0.7089706943819438,
0.6528323666858076,
0.7703690530748709,
0.7063417690564554,
0.659407045000513,
0.6748263472240738,
0.7539619264384773,
0.7329013588219363,
0.7278093719576061,
0.6798215965900886,
0.7321235458031602,
0.7623326472544149,
0.7106304447454121,
0.7298353283791518,
0.8280081177680023,
0.8487875303968083,
0.7586902498993008,
0.7262160395789158,
0.7571717574914204,
0.7170194298438589,
0.7135185051713953,
0.7651642045562742,
0.8141800133606178,
0.744641171259305,
0.7047019700477195,
0.7210806601313137,
0.7938074864499073,
0.7314762854007714,
0.6527824764542026,
0.7331345189381514,
0.7035321947901184,
0.7512717385292428,
0.6666258593315276,
0.708048262930314,
0.6874076655010792,
0.6668679623890902,
0.6958574562606905,
0.7502669227156041,
0.6436955328817746,
0.6715456271437414,
0.7528740905989604,
0.7462720387872654,
0.7454392784033899,
0.6555255764141893,
0.6879756182494516,
0.7555951795428152,
0.6682754679285435,
0.7466269192055776,
0.6378575070494119,
0.7708276995608967,
0.7218637002364144,
0.8453927000677727,
0.7638858871897566,
0.712437737726452,
0.6873683133158265,
0.7074891731082739,
0.7216305393684207,
0.7618463147488342,
0.7494764318918425,
0.759075131167549,
0.7038415933653567,
0.7075232375166159,
0.7874649158888938,
0.7313010819863205,
0.7350274389477689,
0.7338570735840917,
0.7480964597537426,
0.7399861283611244,
0.6911968351977298
],
"xaxis": "x",
"yaxis": "y"
}
],
"layout": {
"annotations": [
{
"font": {},
"showarrow": false,
"text": "dataset=test",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.2425,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "dataset=train",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.7575000000000001,
"yanchor": "middle",
"yref": "paper"
}
],
"barmode": "overlay",
"legend": {
"title": {
"text": "label"
},
"tracegroupgap": 0
},
"margin": {
"t": 60
},
"template": {
"data": {
"bar": [
{
"error_x": {
"color": "#2a3f5f"
},
"error_y": {
"color": "#2a3f5f"
},
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "bar"
}
],
"barpolar": [
{
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "barpolar"
}
],
"carpet": [
{
"aaxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"baxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"type": "carpet"
}
],
"choropleth": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "choropleth"
}
],
"contour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "contour"
}
],
"contourcarpet": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "contourcarpet"
}
],
"heatmap": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmap"
}
],
"heatmapgl": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmapgl"
}
],
"histogram": [
{
"marker": {
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "histogram"
}
],
"histogram2d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2d"
}
],
"histogram2dcontour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2dcontour"
}
],
"mesh3d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "mesh3d"
}
],
"parcoords": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "parcoords"
}
],
"pie": [
{
"automargin": true,
"type": "pie"
}
],
"scatter": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter"
}
],
"scatter3d": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter3d"
}
],
"scattercarpet": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattercarpet"
}
],
"scattergeo": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergeo"
}
],
"scattergl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergl"
}
],
"scattermapbox": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattermapbox"
}
],
"scatterpolar": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolar"
}
],
"scatterpolargl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolargl"
}
],
"scatterternary": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterternary"
}
],
"surface": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "surface"
}
],
"table": [
{
"cells": {
"fill": {
"color": "#EBF0F8"
},
"line": {
"color": "white"
}
},
"header": {
"fill": {
"color": "#C8D4E3"
},
"line": {
"color": "white"
}
},
"type": "table"
}
]
},
"layout": {
"annotationdefaults": {
"arrowcolor": "#2a3f5f",
"arrowhead": 0,
"arrowwidth": 1
},
"autotypenumbers": "strict",
"coloraxis": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"colorscale": {
"diverging": [
[
0,
"#8e0152"
],
[
0.1,
"#c51b7d"
],
[
0.2,
"#de77ae"
],
[
0.3,
"#f1b6da"
],
[
0.4,
"#fde0ef"
],
[
0.5,
"#f7f7f7"
],
[
0.6,
"#e6f5d0"
],
[
0.7,
"#b8e186"
],
[
0.8,
"#7fbc41"
],
[
0.9,
"#4d9221"
],
[
1,
"#276419"
]
],
"sequential": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"sequentialminus": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
]
},
"colorway": [
"#636efa",
"#EF553B",
"#00cc96",
"#ab63fa",
"#FFA15A",
"#19d3f3",
"#FF6692",
"#B6E880",
"#FF97FF",
"#FECB52"
],
"font": {
"color": "#2a3f5f"
},
"geo": {
"bgcolor": "white",
"lakecolor": "white",
"landcolor": "#E5ECF6",
"showlakes": true,
"showland": true,
"subunitcolor": "white"
},
"hoverlabel": {
"align": "left"
},
"hovermode": "closest",
"mapbox": {
"style": "light"
},
"paper_bgcolor": "white",
"plot_bgcolor": "#E5ECF6",
"polar": {
"angularaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"radialaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"scene": {
"xaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"yaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"zaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
}
},
"shapedefaults": {
"line": {
"color": "#2a3f5f"
}
},
"ternary": {
"aaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"baxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"caxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"title": {
"x": 0.05
},
"xaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
},
"yaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
}
}
},
"width": 500,
"xaxis": {
"anchor": "y",
"domain": [
0,
0.98
],
"title": {
"text": "cosine_similarity"
}
},
"xaxis2": {
"anchor": "y2",
"domain": [
0,
0.98
],
"matches": "x",
"showticklabels": false
},
"yaxis": {
"anchor": "x",
"domain": [
0,
0.485
],
"title": {
"text": "count"
}
},
"yaxis2": {
"anchor": "x2",
"domain": [
0.515,
1
],
"matches": "y",
"title": {
"text": "count"
}
}
}
}
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Test accuracy: 88.5% ± 2.4%\n"
]
},
{
"data": {
"application/vnd.plotly.v1+json": {
"config": {
"plotlyServerURL": "https://plot.ly"
},
"data": [
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=train<br>cosine_similarity_custom=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
0.77248496,
0.7784299,
0.63336056,
0.52814114,
0.68339336,
0.6983657,
0.5284781,
0.3788775,
0.22805688,
0.7589808,
0.51160717,
0.53826344,
0.71505487,
0.6731735,
0.73881686,
0.79921746,
0.95130616,
0.5881616,
0.64278483,
0.44692382,
0.45345563,
0.7182223,
0.85243905,
0.9259887,
0.684955,
0.5581617,
0.32273135,
0.8015765,
0.452338,
0.667066,
0.45639065,
0.7842538,
0.75384325,
0.59760255,
0.60028285,
0.6965371,
0.41186273,
0.41767392,
0.6385952,
0.5971813,
0.65681064,
0.8282272,
0.65823907,
0.84640884,
0.53093946,
0.73511046,
0.713981,
0.65439117,
0.782029,
0.7042085,
0.645937,
0.8509375,
0.87341547,
0.22383259,
0.6640968,
0.5427215,
0.88757217,
0.91241723,
0.51576656,
0.6248116,
0.6026416,
0.56253356,
0.7653189,
0.6296878,
0.719958,
0.8117503,
0.67491907,
0.80204415,
0.7968115,
0.7092085,
0.47931457,
0.57092154,
0.5398638,
0.59935147,
0.9654116,
0.49487326,
0.525844,
0.73307645,
0.78057235,
0.78262895,
0.057300583,
0.5645189,
0.78268164,
0.86234576,
0.63160944,
0.7147478,
0.60258704,
0.9636459,
0.58261365,
0.71542233,
0.8029757,
0.6152953,
0.5399907,
0.75521004,
0.50623894,
0.46605533,
0.832118,
0.52262926,
0.8071625,
0.6394983,
0.81730264,
0.44511,
0.7946228,
0.59091306,
0.5595469,
0.40813738,
0.84121096,
0.7739432,
0.8363627,
0.779532,
0.821954,
0.50575066,
0.698487,
0.76210177,
0.57264173,
0.24336877,
0.46758747,
0.57906824,
0.28787997,
0.66878873,
0.73653793,
0.7945788,
0.31603685,
0.5964392,
0.8927264,
0.611356,
0.7643306,
0.36515507,
0.8379778,
0.67038816,
0.41761935,
0.59509,
0.6935317,
0.8718555,
0.9405072,
0.84945035,
0.38103023,
0.25619453,
0.7988593,
0.62076163,
0.89244556,
0.48064572,
0.7961926,
0.5711337,
0.63520235,
0.8030992,
0.6017814,
0.7796846,
0.59891087,
0.31629696,
0.634774,
0.40599576,
0.49223524,
0.86316425,
0.57009614,
0.75571966,
0.5823424,
0.91155875,
0.47491133,
0.6215925,
0.8091878,
0.62777936,
0.44337013,
0.6718892,
0.79578614,
0.7285537,
0.3637289,
0.36611405,
0.65674233,
0.72641397,
0.24788204,
0.7415271,
0.7175368,
0.026403282,
0.66155183,
0.6367993,
0.3978415,
0.38154712,
0.49615347,
0.76164937,
0.28681353,
0.6180269,
0.44082883,
0.5816052,
0.42241842,
0.75485027,
0.6321553,
0.37196934,
0.63474286,
0.822939,
0.87802756,
0.5355695,
0.63167036,
0.56699127,
0.5522422,
0.68989044,
0.634509,
0.6081131,
0.1948918,
0.7769932,
0.8574976,
0.22530068,
0.7621219,
0.8515166,
0.5242935,
0.32441375,
0.5689761,
0.74622434,
0.72154737,
0.62318337,
0.6367095,
0.6926161,
0.715009,
0.6400703,
0.84062994,
0.5916451,
0.62711394,
0.88403666,
0.5040118,
0.8730072,
0.7379788,
0.50844496,
0.849545,
0.39726606,
0.8747891,
0.8987666,
0.6449226,
0.5780694,
0.8981973,
0.24144614,
0.56592035,
0.73597234,
0.63866633,
0.71725285,
0.8253404,
0.9066825,
0.3927554,
0.5889276,
0.5843413,
0.51398474,
0.20076104,
0.7694918,
0.7081953,
0.7169068,
0.4934773,
0.57596314,
0.742142,
0.6061998,
0.5512241,
0.87977856,
0.5494719,
0.28172454,
0.7589442,
0.4531388,
0.96336895,
0.6005618,
0.35295638,
0.5261942,
0.43217444,
0.49589533,
0.85063046,
0.46265322,
0.8199456,
0.6196507,
0.64309496,
0.6111195,
0.4816848,
0.6172504,
0.7331259,
0.7113977,
0.6870145,
0.4682267,
0.65716565,
0.5441023,
0.8265666,
0.7907503,
0.60990417,
0.829844,
0.56094414,
0.7538857,
0.9424744,
0.5295073,
0.38487643,
0.80851406,
0.5365289,
0.43717736,
0.79474694,
0.51396763,
0.87860847,
0.55010337,
0.7809881,
0.6932891,
0.39742857,
0.8390769,
0.7865644,
0.60314524,
0.48687065,
0.8251649,
0.15959363,
0.39625174,
0.9085188,
0.7674489,
0.60302,
0.52618587,
0.25219148,
0.90128976,
0.58759177,
0.38300294,
0.44799256,
0.6239734,
0.5693853,
0.7915201,
0.663876,
0.26932055,
0.625332,
0.4802549,
0.46194786,
0.5894186,
0.51822287,
0.6966118,
0.23634279,
0.5654215,
0.6291901,
0.7490897,
0.8436436,
0.42465052,
0.51954037,
0.6536923,
0.8509788,
0.79862636,
0.8227171,
0.68281716,
0.1812749,
0.7648928
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=1<br>dataset=test<br>cosine_similarity_custom=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "1",
"marker": {
"color": "#636efa",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "1",
"offsetgroup": "1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
0.82917607,
0.66134596,
0.51954716,
0.86257166,
0.72628254,
0.750526,
0.4864731,
0.5766724,
0.67553365,
0.70444363,
0.7962275,
0.6828659,
0.62579423,
0.7612651,
0.39220566,
0.23525229,
0.7430274,
0.681206,
0.57100576,
0.4618181,
0.7717482,
0.66903865,
0.81862015,
0.7363578,
0.65482765,
0.5900665,
0.41807476,
0.56522197,
0.79622376,
0.64242285,
0.5257151,
0.35724044,
0.89998907,
0.6990958,
0.6718526,
0.6950704,
0.62698066,
0.843995,
0.71656066,
0.541267,
0.757286,
0.5853929,
0.69428277,
0.66953033,
0.77486753,
0.71624506,
0.44861966,
0.62043154,
0.96741444,
0.71656066,
0.7523478,
0.35968027,
0.23837104,
0.6165728,
0.9381524,
0.65970534,
0.6937098,
0.42048588,
0.7320912,
0.5411735,
0.5515443,
0.7324711,
0.8654217,
0.7345595,
0.120572515,
0.8880181,
0.72959507,
0.64801633,
0.6179143,
0.775555,
0.69528437,
0.6511554,
0.99999994,
0.45763364,
0.4891816,
0.52689016,
0.62137806,
0.46431175,
0.59306586,
0.7417828,
0.9080831,
0.94495535,
0.6842458,
0.6395839,
0.3651511,
0.6974497,
0.4591394,
0.9137486,
0.72217363,
0.33942583,
0.63102245,
0.4121808,
0.50317925,
0.52083516,
0.82563996,
0.5834028,
0.57319915,
0.5267121,
0.430773,
0.67746174,
0.73006845,
0.76708615,
0.84007776,
0.32630995,
0.7430283,
0.93461716,
0.32242122,
0.88632876,
0.93415016,
0.7947591,
0.663094,
0.86432904,
0.81203866,
0.6153065,
0.6051472,
0.769218,
0.2771657,
0.9123674,
0.5813087,
0.83095586,
0.89594924,
0.4161997,
0.39994922,
0.5083682,
0.586647,
0.584445,
0.62957615,
0.6304678,
0.56890905,
0.5099941,
0.73965687,
0.48745942,
0.06789604,
0.88457304,
0.3923879,
0.59776837,
0.57422984,
0.8600533,
0.6060943,
0.6682653,
0.71450746,
0.53604424,
0.5438552,
0.72500855,
0.6166733,
0.7766372,
0.57380986,
0.8721462,
0.5721363,
-0.079339616,
0.7242797,
0.7362908,
0.17497122,
0.4649962,
0.79716206,
0.6340671,
0.6538217,
0.17040826,
0.16764203,
0.645858,
0.72927165,
0.9039947,
0.69287795,
0.6995642,
0.6772414,
0.24469051,
0.7798243,
0.61422795,
0.7830224,
0.4285734,
0.7894828,
0.94119364,
0.6042667,
0.70037603,
0.86497736,
0.7820186,
0.66940856,
0.27114147,
0.48410234,
0.8427682,
0.6319206,
0.40340644,
0.4700949,
0.7175678,
0.63519037,
0.5789876,
0.6412378,
0.3299353,
0.18665552,
0.6760972,
0.9192385,
0.7746668,
0.03245816,
0.70526874,
0.56978154,
0.6314372,
0.7081967,
0.60695934,
0.63611096,
0.7999741,
0.6866334,
0.118245445,
0.67670697,
0.29726192,
0.574974,
0.66640645,
0.23206614,
0.6671905,
0.48326084,
0.75353026,
0.32032862,
0.65064925,
0.6227874,
0.80652577,
0.6265472,
0.58585143,
0.4563265,
0.8286115,
0.59706944,
0.48983166,
0.3551375,
0.5591357,
0.58302546,
0.97374123,
0.39029583,
0.70444363,
0.44744816,
0.8147974,
0.5247821,
0.37105292,
0.49400952,
0.14671415,
0.87963825,
0.88699996,
0.6475237,
0.910521,
0.6595712,
0.8728903,
0.52119327,
0.7909969,
0.6671504,
0.64764446,
0.7278351,
0.46071562,
0.23280819,
0.2017593,
0.6683943,
0.7223231,
0.020465637,
0.856319,
0.5587094,
0.7444295,
0.8034329,
0.77059805,
0.9513212,
0.6630309,
0.68574387,
0.58244973,
0.8876816,
0.8364729,
0.78089803,
0.58695436,
0.32310322,
-0.025134433,
0.64004785,
0.9867934,
0.8780257,
0.92411137,
0.7796486,
0.8637402,
0.6165511,
0.46809453,
0.68845624,
0.79482335,
0.488163,
0.7581433,
0.37998945,
0.56498086,
0.4867107,
0.43123555,
0.735851,
0.7658905,
0.44860497,
0.6289654,
0.35546064,
0.7126528,
0.10628839,
0.6221842,
0.6790467,
0.6917655,
0.8566937,
0.8270255,
0.8787888,
0.74003184,
0.3644137,
0.43054557,
0.73093873,
0.26555946,
0.71534157,
0.7500416,
0.4413064,
0.54119307,
0.46134758,
0.5704466,
0.21199419,
0.7459575,
0.6160228,
0.6868913,
0.7344422,
0.7065232,
0.6571642,
0.7425699,
0.5527032,
0.9999999,
0.6532282,
0.55384946,
0.58197993,
0.67588943,
0.3203788,
0.6354762,
0.72495973,
0.8276395,
0.8291155,
0.5057268,
1,
0.5734691,
0.3646964,
0.6034978,
0.5014944,
0.51001227,
0.3624282,
0.68022674,
0.25739744,
0.87442994,
0.6571843
],
"xaxis": "x",
"yaxis": "y"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=train<br>cosine_similarity_custom=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": true,
"type": "histogram",
"x": [
-0.14631356,
0.2150645,
-0.17721273,
-0.014774359,
0.31493026,
-0.23921429,
-0.016176043,
-0.19633132,
-0.24616908,
-0.32394183,
-0.1970174,
0.20864879,
-0.21525776,
0.07997987,
-0.1294077,
0.14879516,
-0.14550175,
-0.118881874,
0.82065755,
-0.1603219,
0.173046,
0.015787799,
0.04696088,
0.020042073,
-0.006876593,
0.37310848,
0.08655023,
-0.058168516,
0.26778632,
0.15199652,
-0.38437328,
-0.038109127,
-0.12385904,
0.1167721,
0.050700866,
0.30713704,
0.65622365,
-0.09784568,
-0.052613672,
-0.124625646,
0.08843259,
-0.13179678,
-0.13562314,
-0.044120546,
-0.088424474,
-0.25425982,
-0.09659954,
-0.040981703,
-0.19826211,
-0.15191652,
0.23915267,
0.22734964,
-0.10481846,
-0.044474065,
0.22741382,
-0.14752856,
0.01875659,
0.048702266,
-0.34744936,
-0.25667807,
-0.243239,
0.2407964,
0.008137234,
-0.26645315,
-0.07419607,
-0.006743874,
0.0031533076,
0.17131501,
0.574362,
0.02954714,
-0.19186912,
0.069174334,
-0.17349511,
0.2007786,
0.20528132,
-0.37692046,
-0.1295822,
-0.18697645,
-0.40444633,
0.09277144,
-0.063113466,
-0.14736354,
0.29080844,
-0.095374234,
-0.18964,
-0.29373887,
-0.123259164,
-0.013380485,
-0.33879948,
-0.20295757,
-0.3829651,
0.16620393,
-0.19645569,
-0.051244095,
0.33984855,
-0.17247257,
-0.018112766,
-0.21491455,
-0.19354065,
-0.07477518,
-0.24652542,
-0.28667355,
0.074490495,
-0.31614456,
0.07486205,
-0.24968289,
-0.08314688,
0.2901761,
-0.11501841,
-0.001687157,
-0.091639616,
0.055219226,
-0.12197616,
-0.09869877,
-0.22108202,
0.16011831,
-0.060552042,
-0.12809907,
-0.19778465,
0.3850776,
0.15741424,
-0.37466618,
-0.023829255,
-0.06758464,
0.0013383938,
-0.067550294,
-0.032675233,
0.27911675,
-0.1342553,
-0.035469525,
0.0052597383,
-0.22645237,
-0.17526871,
-0.01654997,
-0.008666728,
-0.080432065,
-0.17188518,
-0.07848628,
-0.31877494,
-0.101964645,
0.33627218,
-0.32177982,
0.11209051,
0.3472009,
-0.030815527,
0.20214817,
0.17314893,
-0.15178011,
-0.044413526,
-0.08025528,
-0.036648445,
-0.03373466,
0.009053207,
0.20856336,
0.11053565,
-0.066070445,
-0.29188097,
-0.059765942,
0.7123153,
-0.14245254,
-0.08150609,
0.06566586,
-0.11268348,
-0.134817,
-0.2234768,
0.059913013,
0.35684714,
0.027112674,
0.6119956,
-0.014874908,
0.044963945,
-0.08865894,
0.2662653,
0.03972694,
-0.19578351,
0.1062562,
0.23575628,
-0.108503595,
-0.37158218,
-0.3725603,
0.1547215,
-0.004524674,
0.36374068,
0.32387802,
-0.061987463,
0.079557166,
-0.0769954,
0.40580907,
-0.26079604,
0.046659548,
-0.0038229104,
-0.17485327,
-0.17292248,
0.1535689,
0.3184635,
0.23071532,
-0.2237163,
0.010720961,
-0.0266104,
-0.016314883,
-0.35185197,
0.032475714,
0.21263245,
-0.08914347,
-0.041971862,
0.080931894,
0.27908522,
-0.33167896,
-0.19740541,
0.60109526,
-0.24172406,
-0.18591811,
0.09733032,
0.15689106,
0.15195245,
-0.053635307,
0.047875404,
0.047278233,
-0.09813666,
-0.11950919,
-0.11011585,
-0.4279913,
0.3784185,
0.14334464,
-0.3375014,
0.11212784,
0.1950696,
-0.010580671,
-0.11083246,
0.058871597,
0.02834569,
-0.03875936,
0.06863422,
-0.21860555,
-0.2563525,
-0.19519866,
0.111297436,
0.020913715,
0.29514563,
-0.039783094,
-0.04550997,
0.066707164,
-0.038498834,
-0.1521725,
-0.14355755,
0.050608896,
-0.26269618,
0.055021893,
0.014681513,
-0.1770695,
-0.21011236,
-0.09999081,
-0.27029344,
-0.1711503,
-0.016302368,
-0.15405744,
-0.008672558,
0.07472157,
-0.1829593,
0.25931573,
-0.23437054,
0.076603614,
-0.07636144,
0.2090101,
-0.16502573,
-0.016919438,
-0.24348837,
-0.0046670404,
-0.10531217,
-0.21521884,
-0.24931237,
-0.12921622,
-0.12875667,
-0.20417155,
-0.20525426,
-0.36325544,
-0.3333487,
-0.09185814,
-0.1623453,
0.027464593,
-0.14274749,
-0.010869549,
-0.13943951,
0.3335042,
-0.028699003,
-0.03822171,
-0.030667558,
0.2038631,
-0.07041302,
-0.05917097,
0.21281564,
0.17070198,
-0.15298772,
-0.25131747,
-0.06146742,
0.08504817,
-0.27123213,
-0.15588854,
0.19577187,
0.16853955,
-0.16118304,
0.015012174,
-0.09497487,
-0.13374645,
-0.35322627,
-0.038553353,
0.07657545,
-0.22150058,
0.09306893,
0.23299377,
-0.20772435,
0.12885478,
-0.09416947,
-0.10465764,
-0.049436413,
-0.11989924,
-0.27874103,
0.003650675,
0.2310157,
0.006286244,
-0.07283006,
-0.06107186,
-0.06928166,
-0.05982821,
-0.14942047,
-0.02492789,
-0.13164556,
0.057718668,
0.16157383,
-0.1530251,
0.84121096,
-0.15210995,
-0.20154536,
0.07821582
],
"xaxis": "x2",
"yaxis": "y2"
},
{
"alignmentgroup": "True",
"bingroup": "x",
"hovertemplate": "label=-1<br>dataset=test<br>cosine_similarity_custom=%{x}<br>count=%{y}<extra></extra>",
"legendgroup": "-1",
"marker": {
"color": "#EF553B",
"opacity": 0.5,
"pattern": {
"shape": ""
}
},
"name": "-1",
"offsetgroup": "-1",
"orientation": "v",
"showlegend": false,
"type": "histogram",
"x": [
-0.044232007,
-0.16394736,
0.1632403,
0.30898154,
-0.31152728,
-0.1412171,
0.16544746,
-0.039992098,
-0.06739781,
0.29328457,
0.20841835,
0.39240003,
-0.072083674,
-0.1323954,
0.65390515,
-0.1419817,
0.14207879,
0.37233403,
0.41775197,
0.4477062,
0.007330986,
-0.060661115,
-0.1533373,
0.16154653,
0.30986795,
-0.04987149,
-0.05039994,
-0.3099663,
0.033768795,
0.049615033,
-0.12513095,
0.084499285,
0.014784025,
0.25746813,
-0.2521216,
0.30037472,
0.35965076,
-0.27012432,
0.33456212,
0.46182656,
0.00022935998,
-0.104092345,
-0.015941534,
0.04993073,
0.19007745,
-0.09215645,
0.29843694,
0.17700168,
-0.07530599,
-0.05198914,
-0.01979556,
0.09341776,
0.24662425,
0.08839096,
0.07074861,
0.3306596,
-0.16030385,
0.44936237,
0.076366216,
0.012272965,
0.13835257,
0.3070978,
-0.16659315,
0.14158975,
-0.08432442,
0.0562755,
-0.010829448,
-0.12433197,
0.48499557,
0.22925165,
-0.11255164,
0.05474982,
0.07222052,
0.004476305,
-0.21357912,
-0.23112524,
-0.1334593,
-0.064378396,
-0.16469488,
-0.15095755,
-0.08395999,
-0.049123537,
-0.14437404,
-0.12055686,
-0.24628904,
0.17622618,
-0.18585612,
-0.17259908,
0.026403282,
0.10386413,
0.13815571,
-0.20282538,
0.061352704,
0.13641344,
-0.033069834,
0.14682728,
-0.09846398,
-0.1758675,
-0.2361085,
-0.048240636,
0.2346595,
0.18942785,
-0.18750884,
0.06963452,
0.6345119,
-0.17317489,
-0.016473286,
-0.054224994,
-0.18668914,
-0.08300009,
-0.20187493,
-0.14422236,
0.003934787,
0.08670471,
0.027005529,
0.04950772,
0.1901416,
-0.13007763,
-0.099397585,
0.14745389,
-0.0034004387,
-0.11958898,
-0.16305667,
0.2465617,
0.09422206,
0.2216021,
-0.1115703,
0.05818454,
0.0095924195,
0.021805342,
0.5152118,
0.17758283,
0.45484412,
-0.18031012,
0.23781359,
-0.2723197,
-0.120131016,
0.3089557,
-0.07610359,
0.0668349,
-0.049481135,
-0.083503336,
0.11382353,
0.022735275,
0.008745655,
-0.36504245,
-0.07519976,
-0.19438383,
-0.030094014,
0.029862244,
-0.20987879,
0.033771046,
-0.029448012,
0.15654777,
0.115399696,
-0.21117873,
0.023004135,
-0.072446875,
-0.0062482357,
0.13182724,
-0.036617856,
-0.07087057,
0.026369914,
-0.24964777,
-0.038678735,
0.22372119,
-0.13267735,
0.31100368,
0.104494356,
-0.10890534,
-0.117819,
-0.046191424,
0.20686205,
-0.20084614,
-0.057795662,
0.12906538,
-0.10219019,
0.100868136,
-0.12072054,
-0.17850254,
0.03944966,
-0.021106033,
0.2326171,
-0.046452984,
-0.13646394,
-0.3492189,
-0.012646858,
0.11937844,
-0.0064185136,
0.12207642,
0.090352826,
0.009849305,
0.24756432,
-0.25596198,
-0.30309102,
-0.15054822,
0.039494857,
0.36447075,
-0.09793959,
0.0013159337,
0.41358683,
-0.014106633,
-0.008594143,
0.027101198,
-0.19814838,
0.038080417,
-0.051209256,
0.25874758,
-0.120011285,
-0.2528509,
0.043007903,
0.3127802,
-0.09420785,
-0.1121926,
-0.0418009,
-0.044757575,
-0.14953983,
0.35707867,
-0.1628269,
-0.20438501,
0.2531824,
-0.20827933,
0.05785267,
0.05551476,
-0.03850203,
-0.13426308,
-0.13823794,
0.019035866,
-0.057939056,
-0.062702164,
-0.18570858,
0.3011003,
0.074364945,
0.13434608,
-0.19680634,
-0.1466411,
-0.17824507,
0.18261488,
-0.05813366,
-0.042605266,
-0.060093872,
0.061667304,
-0.03175753,
-0.11111864,
0.087330475,
-0.07445618,
-0.090876184,
0.14992936,
-0.14373888,
-0.099595256,
-0.23506185,
0.20909104,
0.13657185,
-0.26776505,
-0.05170945,
-0.17920311,
0.090262756,
0.04035857,
0.38935044,
-0.19861336,
0.10418857,
-0.051638983,
0.023664067,
0.13819756,
0.069997646,
-0.07436105,
-0.06337012,
-0.20118129,
0.06562445,
-0.05829115,
-0.07231094,
0.12834351,
0.12500349,
0.092681214,
-0.0004535587,
-0.058649164,
0.0845435,
0.084556386,
-0.22690262,
0.013672934,
0.36154422,
0.5564432,
0.07591953,
-0.08061838,
-0.086197026,
0.052395478,
0.026628394,
0.03054079,
0.28305045,
-0.15598606,
-0.18054484,
0.09388513,
0.21947819,
-0.124791116,
-0.37435198,
-0.008636702,
-0.10544233,
0.36299005,
-0.2537855,
-0.094328806,
-0.16789453,
-0.04827591,
0.06105138,
0.048602022,
-0.26636595,
-0.10690215,
0.16656478,
0.20447391,
-0.09940268,
-0.20682369,
-0.010833654,
0.06810745,
-0.13920541,
0.23065639,
-0.1771053,
0.07515774,
0.074341066,
0.27232394,
0.17203307,
-0.04792379,
-0.12490794,
0.17855616,
-0.12948748,
0.17483488,
-0.015064626,
0.27426863,
-0.10310955,
-0.098964475,
0.256601,
-0.025469879,
-0.00063412444,
0.18264629,
-0.10874696,
-0.15044397,
-0.19953947
],
"xaxis": "x",
"yaxis": "y"
}
],
"layout": {
"annotations": [
{
"font": {},
"showarrow": false,
"text": "dataset=test",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.2425,
"yanchor": "middle",
"yref": "paper"
},
{
"font": {},
"showarrow": false,
"text": "dataset=train",
"textangle": 90,
"x": 0.98,
"xanchor": "left",
"xref": "paper",
"y": 0.7575000000000001,
"yanchor": "middle",
"yref": "paper"
}
],
"barmode": "overlay",
"legend": {
"title": {
"text": "label"
},
"tracegroupgap": 0
},
"margin": {
"t": 60
},
"template": {
"data": {
"bar": [
{
"error_x": {
"color": "#2a3f5f"
},
"error_y": {
"color": "#2a3f5f"
},
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "bar"
}
],
"barpolar": [
{
"marker": {
"line": {
"color": "#E5ECF6",
"width": 0.5
},
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "barpolar"
}
],
"carpet": [
{
"aaxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"baxis": {
"endlinecolor": "#2a3f5f",
"gridcolor": "white",
"linecolor": "white",
"minorgridcolor": "white",
"startlinecolor": "#2a3f5f"
},
"type": "carpet"
}
],
"choropleth": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "choropleth"
}
],
"contour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "contour"
}
],
"contourcarpet": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "contourcarpet"
}
],
"heatmap": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmap"
}
],
"heatmapgl": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "heatmapgl"
}
],
"histogram": [
{
"marker": {
"pattern": {
"fillmode": "overlay",
"size": 10,
"solidity": 0.2
}
},
"type": "histogram"
}
],
"histogram2d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2d"
}
],
"histogram2dcontour": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "histogram2dcontour"
}
],
"mesh3d": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"type": "mesh3d"
}
],
"parcoords": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "parcoords"
}
],
"pie": [
{
"automargin": true,
"type": "pie"
}
],
"scatter": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter"
}
],
"scatter3d": [
{
"line": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatter3d"
}
],
"scattercarpet": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattercarpet"
}
],
"scattergeo": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergeo"
}
],
"scattergl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattergl"
}
],
"scattermapbox": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scattermapbox"
}
],
"scatterpolar": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolar"
}
],
"scatterpolargl": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterpolargl"
}
],
"scatterternary": [
{
"marker": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"type": "scatterternary"
}
],
"surface": [
{
"colorbar": {
"outlinewidth": 0,
"ticks": ""
},
"colorscale": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"type": "surface"
}
],
"table": [
{
"cells": {
"fill": {
"color": "#EBF0F8"
},
"line": {
"color": "white"
}
},
"header": {
"fill": {
"color": "#C8D4E3"
},
"line": {
"color": "white"
}
},
"type": "table"
}
]
},
"layout": {
"annotationdefaults": {
"arrowcolor": "#2a3f5f",
"arrowhead": 0,
"arrowwidth": 1
},
"autotypenumbers": "strict",
"coloraxis": {
"colorbar": {
"outlinewidth": 0,
"ticks": ""
}
},
"colorscale": {
"diverging": [
[
0,
"#8e0152"
],
[
0.1,
"#c51b7d"
],
[
0.2,
"#de77ae"
],
[
0.3,
"#f1b6da"
],
[
0.4,
"#fde0ef"
],
[
0.5,
"#f7f7f7"
],
[
0.6,
"#e6f5d0"
],
[
0.7,
"#b8e186"
],
[
0.8,
"#7fbc41"
],
[
0.9,
"#4d9221"
],
[
1,
"#276419"
]
],
"sequential": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
],
"sequentialminus": [
[
0,
"#0d0887"
],
[
0.1111111111111111,
"#46039f"
],
[
0.2222222222222222,
"#7201a8"
],
[
0.3333333333333333,
"#9c179e"
],
[
0.4444444444444444,
"#bd3786"
],
[
0.5555555555555556,
"#d8576b"
],
[
0.6666666666666666,
"#ed7953"
],
[
0.7777777777777778,
"#fb9f3a"
],
[
0.8888888888888888,
"#fdca26"
],
[
1,
"#f0f921"
]
]
},
"colorway": [
"#636efa",
"#EF553B",
"#00cc96",
"#ab63fa",
"#FFA15A",
"#19d3f3",
"#FF6692",
"#B6E880",
"#FF97FF",
"#FECB52"
],
"font": {
"color": "#2a3f5f"
},
"geo": {
"bgcolor": "white",
"lakecolor": "white",
"landcolor": "#E5ECF6",
"showlakes": true,
"showland": true,
"subunitcolor": "white"
},
"hoverlabel": {
"align": "left"
},
"hovermode": "closest",
"mapbox": {
"style": "light"
},
"paper_bgcolor": "white",
"plot_bgcolor": "#E5ECF6",
"polar": {
"angularaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"radialaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"scene": {
"xaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"yaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
},
"zaxis": {
"backgroundcolor": "#E5ECF6",
"gridcolor": "white",
"gridwidth": 2,
"linecolor": "white",
"showbackground": true,
"ticks": "",
"zerolinecolor": "white"
}
},
"shapedefaults": {
"line": {
"color": "#2a3f5f"
}
},
"ternary": {
"aaxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"baxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
},
"bgcolor": "#E5ECF6",
"caxis": {
"gridcolor": "white",
"linecolor": "white",
"ticks": ""
}
},
"title": {
"x": 0.05
},
"xaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
},
"yaxis": {
"automargin": true,
"gridcolor": "white",
"linecolor": "white",
"ticks": "",
"title": {
"standoff": 15
},
"zerolinecolor": "white",
"zerolinewidth": 2
}
}
},
"width": 500,
"xaxis": {
"anchor": "y",
"domain": [
0,
0.98
],
"title": {
"text": "cosine_similarity_custom"
}
},
"xaxis2": {
"anchor": "y2",
"domain": [
0,
0.98
],
"matches": "x",
"showticklabels": false
},
"yaxis": {
"anchor": "x",
"domain": [
0,
0.485
],
"title": {
"text": "count"
}
},
"yaxis2": {
"anchor": "x2",
"domain": [
0.515,
1
],
"matches": "y",
"title": {
"text": "count"
}
}
}
}
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Test accuracy after customization: 93.1% ± 1.9%\n"
]
}
],
"source": [
"# plot similarity distribution BEFORE customization\n",
"px.histogram(\n",
" df,\n",
" x=\"cosine_similarity\",\n",
" color=\"label\",\n",
" barmode=\"overlay\",\n",
" width=500,\n",
" facet_row=\"dataset\",\n",
").show()\n",
"\n",
"test_df = df[df[\"dataset\"] == \"test\"]\n",
"a, se = accuracy_and_se(test_df[\"cosine_similarity\"], test_df[\"label\"])\n",
"print(f\"Test accuracy: {a:0.1%} ± {1.96 * se:0.1%}\")\n",
"\n",
"# plot similarity distribution AFTER customization\n",
"px.histogram(\n",
" df,\n",
" x=\"cosine_similarity_custom\",\n",
" color=\"label\",\n",
" barmode=\"overlay\",\n",
" width=500,\n",
" facet_row=\"dataset\",\n",
").show()\n",
"\n",
"a, se = accuracy_and_se(test_df[\"cosine_similarity_custom\"], test_df[\"label\"])\n",
"print(f\"Test accuracy after customization: {a:0.1%} ± {1.96 * se:0.1%}\")\n"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"id": "XO7iqiVjpgkT",
"outputId": "a100a9e0-d5aa-46ab-b8a7-4ec6f7bd1cec"
},
"outputs": [
{
"data": {
"text/plain": [
"array([[ 0.10089379, -0.26317084, -0.72087 , ..., 0.88050383,\n",
" 1.065715 , 1.0044045 ],\n",
" [-0.79380614, -0.42881328, -1.3020372 , ..., -0.05304366,\n",
" 1.8495051 , 0.21670245],\n",
" [-0.52067024, -2.1359727 , 0.6334456 , ..., 1.2324876 ,\n",
" -1.9171742 , -0.5424233 ],\n",
" ...,\n",
" [-0.13776606, -0.24635942, -1.9609704 , ..., -0.6989217 ,\n",
" 0.05333536, -1.5094401 ],\n",
" [ 0.6410639 , 0.76585424, 1.3761768 , ..., 0.07790121,\n",
" -1.9715555 , -0.04332887],\n",
" [-1.3668009 , 0.7661092 , 0.05144122, ..., 0.1640734 ,\n",
" -0.37696475, -0.19387771]], dtype=float32)"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"best_matrix # this is what you can multiply your embeddings by\n"
]
},
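  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sketch of how you might apply `best_matrix` downstream: multiply a regular embedding by the matrix to get a custom embedding. This sketch assumes the same `get_embedding` helper and `default_embedding_engine` used above; `new_text` is a hypothetical example input."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# illustrative sketch (not part of the training run above)\n",
    "# `new_text` is a hypothetical example input; `best_matrix` is the matrix learned above\n",
    "new_text = \"A toy sentence for illustration.\"\n",
    "base_embedding = np.array(get_embedding(new_text, engine=default_embedding_engine))\n",
    "custom_embedding = base_embedding @ best_matrix  # matrix-multiply to get the custom embedding\n",
    "custom_embedding = custom_embedding / np.linalg.norm(custom_embedding)  # optional: re-normalize to unit length\n"
   ]
  },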
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "dPF-sczmpgkT"
},
"outputs": [],
"source": []
}
],
"metadata": {
"colab": {
"name": "customized_embeddings_example_with_synthetic_negatives.ipynb",
"provenance": []
},
"interpreter": {
"hash": "365536dcbde60510dc9073d6b991cd35db2d9bac356a11f5b64279a5e6708b97"
},
"kernelspec": {
"display_name": "Python 3.9.9 ('openai')",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.9"
},
"orig_nbformat": 4
},
"nbformat": 4,
"nbformat_minor": 0
}