justheuristic committed
Commit 9a4cc73
1 Parent(s): 795dc75
.github/workflows/sync_to_hub.yaml CHANGED
@@ -17,4 +17,4 @@ jobs:
       - name: Push to hub
         env:
           HF_TOKEN: ${{ secrets.HF_TOKEN }}
-        run: git push https://training-transformers-together:$HF_TOKEN@huggingface.co/spaces/training-transformers-together/demo main --force
+        run: git push https://training-transformers-together:$HF_TOKEN@huggingface.co/spaces/training-transformers-together/dashboard-embedded main --force
README.md CHANGED
@@ -1,8 +1,8 @@
 ---
- title: NeurIPS Demo
+ title: Mini-dashboard
 emoji: ⚡
- colorFrom: green
- colorTo: blue
+ colorFrom: gray
+ colorTo: gray
 sdk: streamlit
 app_file: app.py
 pinned: false
app.py CHANGED
@@ -5,108 +5,38 @@ If you're not a hedgehog, you shouldn't reuse this code. Use this instead: https
 
 import streamlit as st
 
-
- from st_helpers import make_header, content_text, content_title, cite, make_footer, make_tabs
- from charts import draw_current_progress
-
- st.set_page_config(page_title="Training Transformers Together", layout="centered")
-
- st.markdown("## Full demo will be posted here on December 7th!")
-
- make_header()
-
- content_text(f"""
- There was a time when you could comfortably train state-of-the-art vision and language models at home on your workstation.
- The first convolutional neural net to beat ImageNet
- ({cite("AlexNet", "https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf")})
- was trained for 5-6 days on two gamer-grade GPUs. In contrast, today's TOP-1 ImageNet model
- ({cite("CoAtNet", "https://arxiv.org/abs/2106.04803")})
- takes 20,000 TPU-v3 days. And things are even worse in the NLP world: training
- {cite("GPT‑3", "https://arxiv.org/abs/2005.14165")} on a top-tier server
- with 8x A100 would take decades.""")
-
- content_text(f"""
- So, can individual researchers and small labs still train state-of-the-art? Yes we can!
- All it takes is for a bunch of us to come together. In fact, we're doing it right now and <b>you're invited to join!</b>
- """, vspace_before=12)
-
- draw_current_progress()
-
-
- content_text(f"""
- For this demo we train a model similar to {cite("OpenAI DALL-E", "https://openai.com/blog/dall-e/")},
- that is, a transformer "language model" that generates images from text description.
- It is trained on {cite("LAION-400M", "https://laion.ai/laion-400-open-dataset/")},
- the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on
- the {cite("dalle&#8209;pytorch", "https://github.com/lucidrains/DALLE-pytorch")} implementation
- by {cite("Phil Wang", "https://github.com/lucidrains")} with a few tweaks to make it communication-efficient.
- """, vspace_after=8)
-
-
- with st.expander("How to train efficiently over the internet?"):
- content_text(f"""
- Modern distributed training algorithms are designed for HPC networks with 10-100 gigabit per second bandwidth.
- In turn, a typical Internet connection runs at 10-100 megabits per second: that’s three orders of magnitude slower.
- To make distributed training efficient, you need to win back these three orders of magnitude.
- This may seem daunting at first, but in reality, DL researchers have already made all the necessary pieces for solving this puzzle:
- """)
- content_text(f"""
- <table style="border: 0px;"><tbody style="border: 0px;">
- <tr><td> Speed&#8209;up <br> </td> <td>How to achieve</td></tr>
- <tr><td class=centered><strong>4-16x</strong></td><td>
- <strong>Large-batch training:</strong> {cite("You et al. (2019)", "https://arxiv.org/abs/1904.00962")} proposed a way for training neural networks efficiently with larger batches, and hence, fewer communication rounds.
- </td></tr>
- <tr><td class=centered><strong>4-64x</strong></td><td>
- <strong>Gradient Compression:</strong> from simple {cite("8-bit quantization", "https://arxiv.org/abs/1511.04561")}
- to advanced techniques such as {cite("Deep Gradient Compression", "https://arxiv.org/abs/1712.01887")},
- {cite("PowerSGD", "https://arxiv.org/abs/1905.13727")}, {cite("1-bit Adam", "https://arxiv.org/abs/2102.02888")},
- and many others. As a rule of thumb, you can safely reduce communication by 16-64x. More extreme compression is often
- possible, but it may affect stability or final quality.
- </td></tr>
- <tr><td class=centered><strong>4-24x</strong></td><td>
- <strong>Parameter sharing:</strong> reusing parameters between model layers results in a model with fewer parameters,
- and hence, fewer gradients to communicate. {cite("Lan et al. (2019)", "https://arxiv.org/abs/1909.11942")} and
- {cite("Xue et al. (2021)", "https://arxiv.org/pdf/2107.11817.pdf")} propose efficient parameter sharing techniques
- for NLP and vision.
- </td></tr>
- <tr><td class=centered><strong>1.5-2x</strong></td><td>
- <strong>Overlapping computation with communication:</strong> running network communication in background while
- computing the next portion of gradients. This is a {cite("long-standing trick from HPC", "https://ur.booksc.eu/book/1624068/2d0506")}
- that was recently adapted for DL training. {cite("Ren et al. (2021)", "https://arxiv.org/abs/2101.06840")} show that
- updating parameters in background while computing the next batch of gradients does not reduce convergence.
- </td></tr>
- </tbody></table>
- """)
- content_text("""
- These techniques are already more than enough to cover 1000x slower communication (48 times over!).
- This means that in practice you can pick and choose choose which of them you want in your training run.
- For this demo, we use 8x larger batches, 4x compression, 12x parameter sharing and partial overlapping.
- If you don’t want parameter sharing, you can trade it for more advanced gradient compression or larger batches.
- """)
-
- content_title("How do I join?")
- content_text("To be updated on December 7th")
-
- # content_text(f"""
- # That's easy. First, make sure you're logged in at Hugging Face. If you don't have an account, create one {cite("here", "https://huggingface.co/join")}.<br>
- #
- # <ul style="text-align: left; list-style-position: inside; margin-top: 12px; margin-left: -24px;">
- # <li style="margin-top: 4px;">
- # Join our organization on Hugging Face here: <b>TODO</b>. </li>
- # <li style="margin-top: 4px;">
- # The simplest way to start is with colab <b>TODO</b>;</li>
- # <li style="margin-top: 4px;">
- # You can find other starter kits, evaluation and inference notebooks <b>TODO IN OUR ORGANIZATION</b>;</li>
- # <li style="margin-top: 4px;">
- # If you have any issues, <b>TODO DISCORD BADGE</b> </li>
- # </ul>
- #
- # Please note that we currently limit the number of colab participants to <b>TODO</b> to make sure we do not interfere
- # with other users. If there are too many active peers, take a look at alternative starter kits here <b>TODO</b>
- # """)
-
- content_title("What happens inside?")
-
- make_tabs()
-
- make_footer()
+ from dashboard_utils.main_metrics import get_main_metrics
+
+ st.set_page_config(page_title="Training Transformers Together - Mini-Dashboard", layout="wide")
+
+
+ source = get_main_metrics()
+ st.vega_lite_chart(
+ source, {
+ "height": 200,
+ "title": {"text": "Training DALL-E with volunteers", "dy": 7},
+ # ^-- WARNING: do not use long titles, otherwise vega collapses on small screens
+ "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
+ "description": "Current training progress",
+ "encoding": {"x": {"field": "wall time", "type": "temporal"}},
+ "config": {"axisX": {"labelAngle": -40}},
+ "resolve": {"scale": {"y": "independent"}},
+ "layer": [
+ {
+ "mark": {"type": "line", "point": {"tooltip": True, "filled": False, "strokeOpacity": 0},
+ "color": "#85A9C5"},
+ "encoding": {
+ "y": {"field": "training loss", "type": "quantitative", "axis": {"titleColor": "#85A9C5"},
+ "scale": {"zero": False}}},
+ },
+ {
+ "mark": {"type": "line", "point": {"tooltip": True, "filled": False, "strokeOpacity": 0.0},
+ "color": "#85C5A6", "opacity": 0.5},
+ "encoding": {
+ "y": {"field": "active participants", "type": "quantitative",
+ "axis": {"titleColor": "#85C5A6"}}},
+ },
+ ],
+ },
+ use_container_width=True, # breaks on <600px screens
+ )
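Note: the new app.py imports get_main_metrics from dashboard_utils/main_metrics.py, a module this commit does not touch, so only its output schema is visible above (columns "wall time", "training loss", "active participants"). The sketch below is not part of the commit; it is a minimal illustration of what such a helper could look like, assuming the metrics come from the same Weights & Biases project used by the deleted dashboard_utils/bubbles.py. The choice of run and the metric keys "train/loss" and "alive peers" are assumptions.

import datetime

import pandas as pd
import streamlit as st
import wandb

WANDB_REPO = "learning-at-home/Worker_logs"  # assumption: same project as the deleted bubbles.py
CACHE_TTL = 600


@st.cache(ttl=CACHE_TTL)
def get_main_metrics() -> pd.DataFrame:
    """Return one row per logged step with the columns expected by the vega-lite spec above."""
    api = wandb.Api()
    run = api.runs(WANDB_REPO)[0]  # assumption: a single aggregating run holds the main metrics
    rows = []
    for entry in run.scan_history(keys=["_timestamp", "train/loss", "alive peers"]):
        rows.append(
            {
                "wall time": datetime.datetime.utcfromtimestamp(entry["_timestamp"]),
                "training loss": entry["train/loss"],
                "active participants": entry["alive peers"],  # hypothetical metric name
            }
        )
    return pd.DataFrame(rows)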
charts.py DELETED
@@ -1,52 +0,0 @@
- import streamlit as st
-
- from dashboard_utils.bubbles import get_new_bubble_data
- from dashboard_utils.main_metrics import get_main_metrics
- from streamlit_observable import observable
-
-
- def draw_current_progress():
- source = get_main_metrics()
- st.title("") # which is actually a backend-agnostic offset
- st.vega_lite_chart(
- source, {
- "height": 200,
- "title": "Training DALL-E with volunteers",
- # ^-- WARNING: do not use long titles, otherwise vega collapses on small screens
- "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
- "description": "Current training progress",
- "encoding": {"x": {"field": "wall time", "type": "temporal"}},
- "config": {"axisX": {"labelAngle": -40}},
- "resolve": {"scale": {"y": "independent"}},
- "layer": [
- {
- "mark": {"type": "line", "point": {"tooltip": True, "filled": False, "strokeOpacity": 0},
- "color": "#85A9C5"},
- "encoding": {
- "y": {"field": "training loss", "type": "quantitative", "axis": {"titleColor": "#85A9C5"},
- "scale": {"zero": False}}},
- },
- {
- "mark": {"type": "line", "point": {"tooltip": True, "filled": False, "strokeOpacity": 0.0},
- "color": "#85C5A6", "opacity": 0.5},
- "encoding": {
- "y": {"field": "active participants", "type": "quantitative",
- "axis": {"titleColor": "#85C5A6"}}},
- },
- ],
- },
- use_container_width=True, # breaks on <600px screens
- )
-
-
- def draw_participant_bubbles():
- with st.expander("Who's training?", expanded=False):
- st.markdown("### Collaborative training participants\n(may take a few seconds to load)")
-
- serialized_data, profiles = get_new_bubble_data()
- observable(
- "Participants",
- notebook="d/9ae236a507f54046", # "@huggingface/participants-bubbles-chart",
- targets=["c_noaws"],
- redefine={"serializedData": serialized_data, "profileSimple": profiles},
- )
dashboard_utils/bubbles.py DELETED
@@ -1,140 +0,0 @@
- import datetime
- from concurrent.futures import as_completed
- from urllib import parse
-
- import streamlit as st
- import wandb
- from requests_futures.sessions import FuturesSession
-
- from dashboard_utils.time_tracker import _log, simple_time_tracker
-
- URL_QUICKSEARCH = "https://huggingface.co/api/quicksearch?"
- WANDB_REPO = "learning-at-home/Worker_logs"
- CACHE_TTL = 600
-
-
- @st.cache(ttl=CACHE_TTL)
- @simple_time_tracker(_log)
- def get_new_bubble_data():
- serialized_data_points, latest_timestamp = get_serialized_data_points()
- serialized_data = get_serialized_data(serialized_data_points, latest_timestamp)
-
- usernames = []
- for item in serialized_data["points"][0]:
- usernames.append(item["profileId"])
-
- profiles = get_profiles(usernames)
-
- return serialized_data, profiles
-
-
- @st.cache(ttl=CACHE_TTL)
- @simple_time_tracker(_log)
- def get_profiles(usernames):
- profiles = []
- with FuturesSession() as session:
- futures = []
- for username in usernames:
- future = session.get(URL_QUICKSEARCH + parse.urlencode({"type": "user", "q": username}))
- future.username = username
- futures.append(future)
- for future in as_completed(futures):
- resp = future.result()
- username = future.username
- response = resp.json()
- avatarUrl = None
- if response["users"]:
- for user_candidate in response["users"]:
- if user_candidate["user"] == username:
- avatarUrl = response["users"][0]["avatarUrl"]
- break
- if not avatarUrl:
- avatarUrl = "/avatars/57584cb934354663ac65baa04e6829bf.svg"
-
- if avatarUrl.startswith("/avatars/"):
- avatarUrl = f"https://huggingface.co{avatarUrl}"
-
- profiles.append(
- {"id": username, "name": username, "src": avatarUrl, "url": f"https://huggingface.co/{username}"}
- )
- return profiles
-
-
- @st.cache(ttl=CACHE_TTL)
- @simple_time_tracker(_log)
- def get_serialized_data_points():
-
- api = wandb.Api()
- runs = api.runs(WANDB_REPO)
-
- serialized_data_points = {}
- latest_timestamp = None
- for run in runs:
- run_summary = run.summary._json_dict
- run_name = run.name
-
- if run_name in serialized_data_points:
- if "_timestamp" in run_summary and "_step" in run_summary:
- timestamp = run_summary["_timestamp"]
- serialized_data_points[run_name]["Runs"].append(
- {
- "batches": run_summary["_step"],
- "runtime": run_summary["_runtime"],
- "loss": run_summary["train/loss"],
- "velocity": run_summary["_step"] / run_summary["_runtime"],
- "date": datetime.datetime.utcfromtimestamp(timestamp),
- }
- )
- if not latest_timestamp or timestamp > latest_timestamp:
- latest_timestamp = timestamp
- else:
- if "_timestamp" in run_summary and "_step" in run_summary:
- timestamp = run_summary["_timestamp"]
- serialized_data_points[run_name] = {
- "profileId": run_name,
- "Runs": [
- {
- "batches": run_summary["_step"],
- "runtime": run_summary["_runtime"],
- "loss": run_summary["train/loss"],
- "velocity": run_summary["_step"] / run_summary["_runtime"],
- "date": datetime.datetime.utcfromtimestamp(timestamp),
- }
- ],
- }
- if not latest_timestamp or timestamp > latest_timestamp:
- latest_timestamp = timestamp
- latest_timestamp = datetime.datetime.utcfromtimestamp(latest_timestamp)
- return serialized_data_points, latest_timestamp
-
-
- @st.cache(ttl=CACHE_TTL)
- @simple_time_tracker(_log)
- def get_serialized_data(serialized_data_points, latest_timestamp):
- serialized_data_points_v2 = []
- max_velocity = 1
- for run_name, serialized_data_point in serialized_data_points.items():
- activeRuns = []
- loss = 0
- runtime = 0
- batches = 0
- velocity = 0
- for run in serialized_data_point["Runs"]:
- if run["date"] == latest_timestamp:
- run["date"] = run["date"].isoformat()
- activeRuns.append(run)
- loss += run["loss"]
- velocity += run["velocity"]
- loss = loss / len(activeRuns) if activeRuns else 0
- runtime += run["runtime"]
- batches += run["batches"]
- new_item = {
- "date": latest_timestamp.isoformat(),
- "profileId": run_name,
- "batches": batches,
- "runtime": runtime,
- "activeRuns": activeRuns,
- }
- serialized_data_points_v2.append(new_item)
- serialized_data = {"points": [serialized_data_points_v2], "maxVelocity": max_velocity}
- return serialized_data
perso/change_data.py DELETED
@@ -1,19 +0,0 @@
- import json
- import random
-
- with open(
- "/mnt/storage/Documents/hugging_face/colaborative_hub_training/demo_neurips/training-transformers-together-dashboard/data/"
- "serializaledata.json",
- "r",
- ) as f:
- serialized_data = json.load(f)
-
- serialized_data_v2 = serialized_data
- serialized_data_v2["points"] = [[item for item in serialized_data["points"][-1] if random.random() > 0.8]]
-
- with open(
- "/mnt/storage/Documents/hugging_face/colaborative_hub_training/demo_neurips/training-transformers-together-dashboard/data/"
- "serializaledata_V2.json",
- "w",
- ) as f:
- f.write(json.dumps(serialized_data_v2))
perso/get_usernames.py DELETED
@@ -1,14 +0,0 @@
- import json
-
- with open(
- "/mnt/storage/Documents/hugging_face/colaborative_hub_training/demo_neurips/training-transformers-together-dashboard/data/"
- "serializaledata_V2.json",
- "r",
- ) as f:
- serialized_data = json.load(f)
-
- usernames = []
- for item in serialized_data["points"][0]:
- usernames.append(item["profileId"])
-
- print(usernames)
st_helpers.py DELETED
@@ -1,55 +0,0 @@
- from typing import Sequence
-
- import streamlit as st
- import streamlit.components.v1 as components
-
- with open("static/header.html", 'r', encoding='utf-8') as f:
- header_html = f.read()
- with open("static/header_style.css", 'r', encoding='utf-8') as f:
- embeds_style_css = f.read()
- with open("static/header_animate.js") as f:
- header_animate_js = f.read()
- with open("static/content_style.css", 'r', encoding='utf-8') as f:
- content_style_css = f.read()
- with open("static/meta.html", 'r', encoding='utf-8') as f:
- meta_html = f.read()
- with open("static/tabs.html", 'r', encoding='utf-8') as f:
- tabs_html = f.read()
- with open("static/footer.html", 'r', encoding='utf-8') as f:
- footer_html = f.read()
-
-
- def make_header():
- components.html(f"<style>{embeds_style_css}</style>{header_html}<script>{header_animate_js}</script>", height=260)
- st.markdown(meta_html, unsafe_allow_html=True)
- st.markdown(f"<style>{content_style_css}</style>", unsafe_allow_html=True) # apply css to the rest of the document
- st.markdown(
- '<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@4.5.3/dist/css/bootstrap.min.css" integrity="sha384-TX8t27EcRE3e/ihU7zmQxVncDAy5uIKz4rEkgIXeMed4M0jlfIDPvg6uqKI2xXr2" crossorigin="anonymous">',
- unsafe_allow_html=True,
- )
-
-
- def make_tabs():
- components.html(f"{tabs_html}", height=850, scrolling=True)
-
-
- def make_footer():
- components.html(f"<style>{content_style_css}</style>{footer_html}", height=110)
-
-
- def content_title(title: str, vspace_before: int = 0, vspace_after: int = 0):
- st.markdown(f'<center><div class="padded faded demo_title" '
- f'style="padding-top: {vspace_before}px; padding-bottom: {vspace_after}px; text-align: justify;">'
- f'{title}</div><center>',
- unsafe_allow_html=True)
-
-
- def content_text(text: str, vspace_before: int = 0, vspace_after: int = 0):
- st.markdown(f'<center><div class="padded faded demo_text" '
- f'style="padding-top: {vspace_before}px; padding-bottom: {vspace_after}px; text-align: justify;">'
- f'{text}</div><center>',
- unsafe_allow_html=True)
-
-
- def cite(tag, link):
- return f"""<a target="_blank" rel="noopener noreferrer" href="{link}">{tag}</a>"""
static/content_style.css DELETED
@@ -1,73 +0,0 @@
1
- .faded {
2
- margin: 0 auto;
3
- background: var(--window-color);
4
- box-shadow: 0 0 1px 1px var(--window-color);
5
- font-family: cursive;
6
- font-family: "Gill Sans", sans-serif;
7
- display: inline-block
8
- }
9
- .centered {
10
- text-align: center;
11
- }
12
- .padded {
13
- width: 100%;
14
- max-width: 800px;
15
- text-align: left;
16
- }
17
- .demo_title {
18
- font-size: 32px;
19
- box-shadow: 0 0 5px 5px var(--window-color);
20
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
21
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
22
- }
23
- .demo_text {
24
- font-size: 16px;
25
- box-shadow: 0 0 5px 5px var(--window-color);
26
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
27
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
28
- }
29
- .arxiv_button {
30
- position: relative;
31
- display: inline-block;
32
- width: 80px;
33
- height: 28px;
34
- background-image: linear-gradient(180deg, #fafbfc, #eff3f6 90%);
35
- color: #24292e;
36
- border: 1px solid rgba(27,31,35,.2);
37
- text-align: center;
38
- cursor: pointer;
39
- border-radius: 4px;
40
- padding-right: 0px;
41
- padding-top: 2.5px;
42
- font-size: 12px;
43
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,sans-serif;
44
- font-weight: 600;
45
- }
46
- .arxiv_button:before {
47
- content: "";
48
- vertical-align:middle;
49
- display: inline-block;
50
- width: 24px;
51
- height: 24px;
52
- border: none;
53
- margin-left: -16px;
54
- margin-right: 4px;
55
- margin-top: -2px;
56
- background: url('data:image/svg+xml;charset=UTF-8,<svg xmlns="http://www.w3.org/2000/svg" class="ionicon s-ion-icon" viewBox="0 0 512 512"><path d="M428 224H288a48 48 0 01-48-48V36a4 4 0 00-4-4h-92a64 64 0 00-64 64v320a64 64 0 0064 64h224a64 64 0 0064-64V228a4 4 0 00-4-4z"></path><path d="M419.22 188.59L275.41 44.78a2 2 0 00-3.41 1.41V176a16 16 0 0016 16h129.81a2 2 0 001.41-3.41z"></path></svg>') right center no-repeat;
57
- background-size: 18px 16px;
58
- }
59
- .arxiv_button:hover {
60
- background-color:#e6ebf1;
61
- background-position:-0.5em;
62
- border-color: #9fa4a9;
63
- border-color:rgba(27,31,35,.35);
64
- background-image:linear-gradient(180deg, #f0f3f6, #e6ebf1 90%)
65
- }
66
- /* a:link {
67
- color: #00194a;
68
- text-decoration: none;
69
- }
70
- a:visited {
71
- color: #3f004a;
72
- text-decoration: none;
73
- } */
static/footer.html DELETED
@@ -1,15 +0,0 @@
- <div style="width: 260px; margin:0 auto;">
- <div style="margin: 0 auto; margin-top: 0px;">
- <a class="github-button" href="https://github.com/learning-at-home/hivemind" data-size="large" data-show-count="false" aria-label="Star learning-at-home/hivemind on GitHub">Code</a>
- <div style="overflow: hidden; white-space: nowrap; margin: 0 auto; display: inline-block;">
- <button onclick="window.open('https://arxiv.org/abs/2106.10207');"
- class="arxiv_button">Paper</button>
- </div>
- <a href="https://twitter.com/intent/tweet?hashtags=neurips,joinhivemind&text=Join%20the%20deep%20learning%20hivemind!%0Atraining-transformers-together.github.io"
- class="twitter-hashtag-button" data-show-count="true" data-size="large">Tweet</a>
- <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
-
- <script async defer src="https://buttons.github.io/buttons.js"></script>
- </div>
- <hr style="margin-bottom: 64px; width:0%; border: 0 solid white">
- </div>
static/header.html DELETED
@@ -1,19 +0,0 @@
- <div id="container">
- <canvas></canvas>
- <div id="overlay">
- <div id="main_window">
- <div id="header">
- <img src="https://learning-at-home.github.io/logo.png" id="bug-logo"
- style="width: 40%; max-height: 320px; max-width: 320px; z-index:1000; position: relative;">
- <br>
- <h1 class="faded title" style="margin-top:-5%;">
- <p style="margin-top: 0px; margin-bottom:0px;">
- <span id="title_text">Training Transformers Together</span>
- </p>
- <p style="font-size: 18px; margin-top:0px; margin-bottom:0px;">
- large-scale training for everyone,&nbsp;by&nbsp;everyone</p>
- </h1>
- </div>
- </div>
- </div>
- </div>
static/header_animate.js DELETED
@@ -1,223 +0,0 @@
1
- // draw background; Note: this background is based on https://codepen.io/pawelqcm/pen/oxPYox by Pawel
2
- // Note 2: Pawel, you're awesome.
3
- (function() {
4
- var content_element = document.getElementById("overlay");
5
- var canvas = document.querySelector('canvas');
6
- var title_elem = document.getElementsByClassName("faded title")[0];
7
- var title_text = document.getElementById("title_text");
8
- ctx = canvas.getContext('2d');
9
- if (!ctx)
10
- console.warn("Your browser does not support canvas, content may be broken :'(");
11
-
12
- var SENSITIVITY, SIBLINGS_LIMIT, DENSITY, TOTAL_NODES, ANCHOR_LENGTH, CURSOR_HEIGHT, CURSOR_WIDTH;
13
- css_opts = getComputedStyle(document.documentElement);
14
- SENSITIVITY = css_opts.getPropertyValue('--background-sensitivity') || 120;
15
- SIBLINGS_LIMIT = css_opts.getPropertyValue('--background-siblings') || 7;
16
- NODE_DENSITY = css_opts.getPropertyValue('--background-node-density') || 6;
17
- CURSOR_WIDTH = css_opts.getPropertyValue('--background-cursor-width') || 250;
18
- CURSOR_HEIGHT = css_opts.getPropertyValue('--background-cursor-height') || 250;
19
- CURSOR_VERTICAL_SHRINK = css_opts.getPropertyValue('--background-cursor-vertical-shrink') || 0.1;
20
- SPEED_COEF = css_opts.getPropertyValue('--background-speed') || 1;
21
- ENERGY_DECAY = css_opts.getPropertyValue('--energy-decay') || 2;
22
- SHOW_IF_WIDER_THAN = css_opts.getPropertyValue('--background-show-if-wider-than') || 500;
23
- MOVE_ON_CURSOR = css_opts.getPropertyValue('--background-move-on-cursor').includes("true") || false;
24
-
25
- var nodes = [];
26
- choice = (choices => choices[Math.floor(Math.random() * choices.length)])
27
- sample_color = () => choice([[40, 40, 40], [133, 133, 133]])
28
-
29
- ANCHOR_LENGTH = 20;
30
-
31
- var cursor = {x: 0, y: 0};
32
-
33
- function centralize_cursor() {
34
- var rect = document.getElementById("bug-logo").getBoundingClientRect()
35
- var window_left = window.pageXOffset || document.documentElement.scrollLeft;
36
- var window_top = window.pageYOffset || document.documentElement.scrollTop;
37
- cursor.x = window_left + rect.left + rect.width / 2;
38
- cursor.y = window_top + rect.top + rect.height / 2;
39
- }
40
-
41
- function Node(x, y) {
42
- this.anchorX = x;
43
- this.anchorY = y;
44
- this.x = Math.random() * (x - (x - ANCHOR_LENGTH)) + (x - ANCHOR_LENGTH);
45
- this.y = Math.random() * (y - (y - ANCHOR_LENGTH)) + (y - ANCHOR_LENGTH);
46
- this.vx = (Math.random() * 2 - 1) * SPEED_COEF;
47
- this.vy = (Math.random() * 2 - 1) * SPEED_COEF;
48
- this.energy = Math.random() * 100;
49
- this.radius = Math.random();
50
- this.siblings = [];
51
- [this.r, this.g, this.b] = sample_color()
52
- this.brightness = 0;
53
- }
54
-
55
- Node.prototype.drawNode = function() {
56
- var color = `rgba(${this.r}, ${this.g}, ${this.b}, ${this.brightness})`;
57
- ctx.beginPath();
58
- ctx.arc(this.x, this.y, 2 * this.radius + 2 * this.siblings.length / SIBLINGS_LIMIT, 0, 2 * Math.PI);
59
- ctx.fillStyle = color;
60
- ctx.fill();
61
- };
62
-
63
- Node.prototype.drawConnections = function() {
64
- for (var i = 0; i < this.siblings.length; i++) {
65
- var color = `rgba(133, 133, 133, ${this.brightness})`;
66
- ctx.beginPath();
67
- ctx.moveTo(this.x, this.y);
68
- ctx.lineTo(this.siblings[i].x, this.siblings[i].y);
69
- ctx.lineWidth = 1 - calcDistance(this, this.siblings[i]) / SENSITIVITY;
70
- ctx.strokeStyle = color;
71
- ctx.stroke();
72
- }
73
- };
74
-
75
-
76
- Node.prototype.moveNode = function() {
77
- this.energy -= ENERGY_DECAY;
78
- if (this.energy < 1) {
79
- this.energy = Math.random() * 100;
80
- if (this.x - this.anchorX < -ANCHOR_LENGTH) {
81
- this.vx = Math.random() * SPEED_COEF;
82
- } else if (this.x - this.anchorX > ANCHOR_LENGTH) {
83
- this.vx = Math.random() * -SPEED_COEF;
84
- } else {
85
- this.vx = Math.random() * SPEED_COEF * 2 - SPEED_COEF;
86
- }
87
- if (this.y - this.anchorY < -ANCHOR_LENGTH) {
88
- this.vy = Math.random() * SPEED_COEF;
89
- } else if (this.y - this.anchorY > ANCHOR_LENGTH) {
90
- this.vy = Math.random() * -SPEED_COEF;
91
- } else {
92
- this.vy = Math.random() * SPEED_COEF * 2 - SPEED_COEF;
93
- }
94
- }
95
- relative_speed_rate = Math.min(canvas.height / 100, 10.0)
96
- this.x += this.vx * this.energy * relative_speed_rate;
97
- this.y += this.vy * this.energy * relative_speed_rate;
98
- };
99
-
100
- function initNodes() {
101
- centralize_cursor();
102
- ctx.clearRect(0, 0, canvas.width, canvas.height);
103
- if (canvas.width >= SHOW_IF_WIDER_THAN)
104
- total_nodes = Math.round(NODE_DENSITY * (canvas.width / 100 * canvas.height / 100));
105
- else
106
- total_nodes = 0;
107
- nodes = [];
108
- for (var i = 0; i < total_nodes; i++)
109
- nodes.push(new Node(50 + Math.random() * (canvas.width - 100),
110
- 5 + Math.random() * (canvas.height - 10)));
111
- }
112
-
113
- function calcDistance(node1, node2) {
114
- return Math.sqrt(Math.pow(node1.x - node2.x, 2) + (Math.pow(node1.y - node2.y, 2)));
115
- }
116
-
117
- function findSiblings() {
118
- var node1, node2, distance;
119
- for (var i = 0; i < nodes.length; i++) {
120
- node1 = nodes[i];
121
- node1.siblings = [];
122
- for (var j = 0; j < nodes.length; j++) {
123
- node2 = nodes[j];
124
- if (node1 !== node2) {
125
- distance = calcDistance(node1, node2);
126
- if (distance < SENSITIVITY) {
127
- if (node1.siblings.length < SIBLINGS_LIMIT) {
128
- node1.siblings.push(node2);
129
- } else {
130
- var node_sibling_distance = 0;
131
- var max_distance = 0;
132
- var s;
133
- for (var k = 0; k < SIBLINGS_LIMIT; k++) {
134
- node_sibling_distance = calcDistance(node1, node1.siblings[k]);
135
- if (node_sibling_distance > max_distance) {
136
- max_distance = node_sibling_distance;
137
- s = k;
138
- }
139
- }
140
- if (distance < max_distance) {
141
- node1.siblings.splice(s, 1);
142
- node1.siblings.push(node2);
143
- }
144
- }
145
- }
146
- }
147
- }
148
- }
149
- }
150
-
151
- function redrawScene() {
152
- resizeWindow();
153
- ctx.clearRect(0, 0, canvas.width, canvas.height);
154
- findSiblings();
155
- var i, node, distance;
156
- for (i = 0; i < nodes.length; i++) {
157
- node = nodes[i];
158
- scaled_distance = calcDistance({x: cursor.x / CURSOR_WIDTH, y: cursor.y / CURSOR_HEIGHT},
159
- {x: node.x / CURSOR_WIDTH, y: node.y / CURSOR_HEIGHT});
160
-
161
- node.brightness = Math.max(1 - scaled_distance, 0);
162
- }
163
- for (i = 0; i < nodes.length; i++) {
164
- node = nodes[i];
165
- if (node.brightness) {
166
- node.drawConnections();
167
- node.drawNode();
168
- }
169
- node.moveNode();
170
- }
171
- requestAnimationFrame(redrawScene);
172
- }
173
-
174
- function initHandlers() {
175
- document.addEventListener('resize', resizeWindow);
176
- document.addEventListener('orientationchange', resizeWindow);
177
- if (MOVE_ON_CURSOR) {
178
- document.addEventListener('mousemove', moveHandler);
179
- document.addEventListener('touchmove', moveHandler);
180
- }
181
- }
182
-
183
- function resizeWindow(evt) {
184
- var new_width, new_height;
185
- new_width = Math.round(Math.max(title_elem.getBoundingClientRect().right, window.innerWidth))
186
- if (screen.width < 640)
187
- title_text.style.fontSize = "24px";
188
- else
189
- title_text.style.fontSize = "32px";
190
-
191
-
192
- if (!MOVE_ON_CURSOR)
193
- new_height = Math.round(title_elem.getBoundingClientRect().top - canvas.getBoundingClientRect().top);
194
- else
195
- new_height = Math.round(Math.max(
196
- content_element.offsetHeight, content_element.scrollHeight,
197
- content_element.clientHeight, window.innerHeight));
198
-
199
- if (canvas.width != new_width || canvas.height != new_height) {
200
- canvas.width = new_width;
201
- canvas.height = new_height;
202
- initNodes();
203
- }
204
- if (!MOVE_ON_CURSOR)
205
- centralize_cursor();
206
- }
207
-
208
- function moveHandler(evt) {
209
- if (evt.type == "mousemove") {
210
- cursor.x = window.pageXOffset + evt.clientX;
211
- cursor.y = window.pageYOffset + evt.clientY;
212
- }
213
- else { // touch event
214
- cursor.x = window.pageXOffset + evt.changedTouches[0].clientX;
215
- cursor.y = window.pageYOffset + evt.changedTouches[0].clientY;
216
- }
217
- }
218
-
219
- initHandlers();
220
- initNodes();
221
- redrawScene();
222
-
223
- })();
static/header_style.css DELETED
@@ -1,148 +0,0 @@
1
- :root {
2
- --border-color: black;
3
- --window-color: white;
4
- --background-move-on-cursor: false;
5
- --background-color: white;
6
- --background-cursor-width: 400;
7
- --background-cursor-height: 200;
8
- --background-show-if-wider-than: 500;
9
- --background-speed: 0.001;
10
- --energy-decay: 0.3;
11
- }
12
- body {
13
- width: 100%;
14
- margin: 0 auto;
15
- background-color: var(--background-color);
16
- }
17
- #container {
18
- position: relative;
19
- width: 100%;
20
- margin: 0 auto;
21
- }
22
- #container canvas, #overlay {
23
- width: 100%;
24
- margin: 0 auto;
25
- position: absolute;
26
- }
27
- canvas {
28
- background-color: var(--background-color);
29
- width: 0px; /* will be changed on init */
30
- overflow: hidden;
31
- }
32
- #main_window {
33
- width: 80%;
34
- min-width: 320px;
35
- margin: 0 auto;
36
- text-align: center;
37
- }
38
- .faded {
39
- margin: 0 auto;
40
- background: var(--window-color);
41
- box-shadow: 0 0 5px 5px var(--window-color);
42
- font-family: cursive;
43
- font-family: "Gill Sans", sans-serif;
44
- display: inline-block
45
- }
46
- .padded {
47
- width: 100%;
48
- max-width: 800px;
49
- text-align: left;
50
- }
51
- .title {
52
- font-size: 32px;
53
- box-shadow: 0 0 5px 5px var(--window-color);
54
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
55
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
56
- }
57
- .text {
58
- font-size: 16px;
59
- box-shadow: 0 0 5px 5px var(--window-color);
60
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
61
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
62
- }
63
- .scrollbar {
64
- overflow-y: scroll;
65
- }
66
- .arxiv_button {
67
- position: relative;
68
- display: inline-block;
69
- width: 80px;
70
- height: 28px;
71
- background-image: linear-gradient(180deg, #fafbfc, #eff3f6 90%);
72
- color: #24292e;
73
- border: 1px solid rgba(27,31,35,.2);
74
- text-align: center;
75
- cursor: pointer;
76
- border-radius: 4px;
77
- padding-right: 0px;
78
- padding-top: 2.5px;
79
- font-size: 12px;
80
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,sans-serif;
81
- font-weight: 600;
82
- }
83
- .arxiv_button:before {
84
- content: "";
85
- vertical-align:middle;
86
- display: inline-block;
87
- width: 24px;
88
- height: 24px;
89
- border: none;
90
- margin-left: -16px;
91
- margin-right: 4px;
92
- margin-top: -2px;
93
- background: url('data:image/svg+xml;charset=UTF-8,<svg xmlns="http://www.w3.org/2000/svg" class="ionicon s-ion-icon" viewBox="0 0 512 512"><path d="M428 224H288a48 48 0 01-48-48V36a4 4 0 00-4-4h-92a64 64 0 00-64 64v320a64 64 0 0064 64h224a64 64 0 0064-64V228a4 4 0 00-4-4z"></path><path d="M419.22 188.59L275.41 44.78a2 2 0 00-3.41 1.41V176a16 16 0 0016 16h129.81a2 2 0 001.41-3.41z"></path></svg>') right center no-repeat;
94
- background-size: 18px 16px;
95
- }
96
- .arxiv_button:hover {
97
- background-color:#e6ebf1;
98
- background-position:-0.5em;
99
- border-color: #9fa4a9;
100
- border-color:rgba(27,31,35,.35);
101
- background-image:linear-gradient(180deg, #f0f3f6, #e6ebf1 90%)
102
- }
103
- /* a:link {
104
- color: #00194a;
105
- text-decoration: none;
106
- }
107
- a:visited {
108
- color: #3f004a;
109
- text-decoration: none;
110
- } */
111
- .tooltip {
112
- position: relative;
113
- display: inline-block;
114
- border-bottom: 1px dotted black;
115
- }
116
-
117
- .tooltip .tooltiptext {
118
- visibility: hidden;
119
- width: 240px;
120
- background-color: #555;
121
- color: #fff;
122
- text-align: center;
123
- border-radius: 6px;
124
- padding: 5px 0;
125
- position: absolute;
126
- z-index: 1;
127
- bottom: 125%;
128
- left: 50%;
129
- margin-left: -60px;
130
- opacity: 0;
131
- transition: opacity 0.3s;
132
- }
133
-
134
- .tooltip .tooltiptext::after {
135
- content: "";
136
- position: absolute;
137
- top: 100%;
138
- left: 50%;
139
- margin-left: -5px;
140
- border-width: 5px;
141
- border-style: solid;
142
- border-color: #555 transparent transparent transparent;
143
- }
144
-
145
- .tooltip:hover .tooltiptext {
146
- visibility: visible;
147
- opacity: 1;
148
- }
static/meta.html DELETED
@@ -1,21 +0,0 @@
- <meta http-equiv="Content-Type" content="text/html; charset=utf-8"/>
- <title>Training Transformers Together</title>
- <meta name="description" content="A NeurIPS'21 demonstration that explains how to train large models together with multiple collaborators.">
- <link rel="mask-icon" href="https://learning-at-home.github.io/logo_small.png">
- <link rel="alternate icon" class="js-site-favicon" type="image/png" href="https://learning-at-home.github.io/logo.png">
- <link rel="icon" class="js-site-favicon" type="image/png" href="https://learning-at-home.github.io/logo.png">
- <meta property="og:url" content="https://training-transformers-together.github.io">
- <meta property="og:site_name" content="learning@home">
- <meta property="og:title" content="Train vast neural networks together">
- <meta property="og:description" content="A NeurIPS'21 demonstration that explains how to train large models together with multiple collaborators.">
- <meta property="og:image" content="https://learning-at-home.github.io/logo_small.png">
- <meta property="og:image:type" content="image/png">
- <meta property="og:image:width" content="96">
- <meta property="og:image:height" content="96">
- <meta property="twitter:site" content="https://training-transformers-together.github.io">
- <meta property="twitter:creator" content="Yandex, Huggingface, Hivemind team & contributors">
- <meta property="twitter:card" content="summary_large_image">
- <meta property="twitter:title" content="learning@home">
- <meta property="twitter:description" content="A NeurIPS'21 demonstration that explains how to train large models together with multiple collaborators.">
- <meta property="twitter:image:src" content="https://learning-at-home.github.io/logo_horizontal.png">
- <meta name="viewport" content="width=device-width, initial-scale=1">
static/tabs.html DELETED
@@ -1,270 +0,0 @@
1
- <html lang="en">
2
- <head>
3
- <title>Bootstrap Example</title>
4
- <meta charset="utf-8">
5
- <meta name="viewport" content="width=device-width, initial-scale=1">
6
- <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.4.1/css/bootstrap.min.css">
7
- <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
8
- <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.4.1/js/bootstrap.min.js"></script>
9
- <style>
10
- .faded {
11
- margin: 0 auto;
12
- background: var(--window-color);
13
- box-shadow: 0 0 5px 5px var(--window-color);
14
- font-family: cursive;
15
- font-family: "Gill Sans", sans-serif;
16
- display: inline-block
17
- }
18
- .padded {
19
- width: 100%;
20
- max-width: 800px;
21
- text-align: left;
22
- }
23
- .demo_title {
24
- font-size: 32px;
25
- box-shadow: 0 0 5px 5px var(--window-color);
26
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
27
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
28
- }
29
- .demo_text {
30
- font-size: 16px;
31
- box-shadow: 0 0 5px 5px var(--window-color);
32
- font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,
33
- sans-serif,Apple Color Emoji,Segoe UI Emoji;
34
- }
35
- .tab-group {
36
- font-size: 15px;
37
- }
38
- .tab-content {
39
- margin-top: 16px;
40
- }
41
- ul > li {
42
- margin: 3px 0;
43
- }
44
- ol > li {
45
- margin: 5px 0;
46
- }
47
- /* a:link {
48
- color: #00194a;
49
- text-decoration: none;
50
- }
51
- a:visited {
52
- color: #3f004a;
53
- text-decoration: none;
54
- } */
55
- </style>
56
- </head>
57
- <body>
58
-
59
- <div class="tab-group" style="width: 100%; margin:0 auto;">
60
- <div>
61
- <!-- Nav tabs -->
62
- <ul class="nav nav-tabs" role="tablist">
63
- <li role="presentation" class="active"><a href="#tab1" aria-controls="tab1" role="tab" data-toggle="tab">Memory-Efficient Training</a></li>
64
- <li role="presentation"><a href="#tab2" aria-controls="tab2" role="tab" data-toggle="tab">Security</a></li>
65
- <li role="presentation"><a href="#tab3" aria-controls="tab3" role="tab" data-toggle="tab">Make Your Own</a></li>
66
- </ul>
67
-
68
- <!-- Tab panes -->
69
- <div class="tab-content">
70
- <div role="tabpanel" class="tab-pane active" id="tab1">
71
- <p>
72
- Our aim is to train a large model in a decentralized fashion on consumer hardware or low-end cloud instances.
73
- This means we need to make the model, dataset, and other memory buffers fit onto a few GB of disk, 12-16 GB of CPU RAM,
74
- and 8-12 GB of GPU memory. Unfortunately, this rules out many popular techniques such as
75
- <a target="_blank" rel="noopener noreferrer" href="https://arxiv.org/abs/2101.06840">ZeRO-Offload</a>:
76
- there is simply not enough RAM for that. Instead, we must make better use of what limited memory we have.
77
- To do this, we use two techniques: 8-bit Optimizers for GPU memory and dataset streaming for RAM & HDD.
78
- </p>
79
- <p>
80
- <b>8-bit Optimizers:</b>
81
- Using optimizers such as LAMB or Adam requires four times as much GPU memory as simply storing model parameters (8 bytes vs 2 bytes).
82
- As such, for training large models with many parameters the optimizers make up the largest chunk of memory.
83
- With 8-bit optimizers this memory is reduced by 75% (2 bytes) making it much easier to fit large models onto consumer GPUs.
84
- </p><p>
85
- Naturally, we can combine this technique with offloading: storing 8-bit optimizer states in CPU memory rather
86
- than GPU memory (0 bytes GPU, 2 bytes CPU). To perform an optimizer update, we transfer the GPU gradients
87
- to the CPU, perform the optimizer update, and then transfer the updated weights to the GPU.
88
- We can do this for each weight one-by-one so that additional CPU memory required for the optimizer update
89
- is minimal.
90
- The combination of offloading and 8-bit optimizers means that we conserve GPU memory (0 bytes per parameter)
91
- and also use only a limited amount of CPU memory (2 bytes per parameter).
92
- </p>
93
- <p>
94
- <b>Dataset Streaming</b>
95
- Usually data is stored on disk and needs to be fully or partially loaded into CPU memory to be used for training.
96
- Large datasets used for pre-training measure in <a target="_blank" rel="noopener noreferrer" href="https://arxiv.org/abs/2101.00027">hundreds of gigabytes</a>
97
- or even <a target="_blank" rel="noopener noreferrer" href="https://laion.ai/laion-400-open-dataset/">terabytes</a>.
98
- This can pose a significant problem, as most desktop and cheap cloud instance simply do not have that much space.
99
- Furthermore, downloading the dataset over the internet would take up hours before one can even begin training.
100
- <!--Changing the dataset means downloading a new dataset in full and using additional disk space.-->
101
- </p>
102
- <p>
103
- To circumvent these problems, we stream the training dataset in the same way as you stream online videos.
104
- Participants download a small random portion of the training dataset and immediately begin training on it,
105
- while additional data is loaded in background. As such, we can train a model with virtually no memory
106
- overhead from the dataset and switching to a new dataset is as simple as changing an argument to the dataset class.
107
- </p>
108
- <center>
109
- Here's a tutorial for using these techniques:<br>
110
- <a target="_blank" rel="noopener noreferrer" href="https://colab.research.google.com/gist/justheuristic/75f6a2a731f05a213a55cd2c8a458aaf/fine-tune-a-language-model-with-dataset-streaming-and-8-bit-optimizers.ipynb">
111
- <img src="https://colab.research.google.com/assets/colab-badge.svg" width=360px>
112
- </a>
113
- </center>
114
-
115
- </div>
116
- <div role="tabpanel" class="tab-pane" id="tab2">
117
- <p>In this section, we discuss common concerns related to security of the collaborative training.</p>
118
-
119
- <p>
120
- <b>Q: If I join a collaborative training, do I allow other people to execute code on my computer?</b>
121
- </p>
122
-
123
- <p>
124
- <b>A:</b> During the training, participants only exchange data (gradients, statistics, model weights) and never send code to each other.
125
- No other peer can execute code on your computer.
126
- </p>
127
-
128
- <p>
129
- To join the training, you typically need to run the code (implementing the model, data streaming, training loop, etc.)
130
- from a repository or a Colab notebook provided by the authors of the experiment.
131
- This is no different from running any other open source project/Colab notebook.
132
- </p>
133
-
134
- <p>
135
- <b>Q: Can a malicious participant influence the training outcome?</b>
136
- </p>
137
-
138
- <p>
139
- <b>A:</b> It is indeed possible unless we use some defense mechanism.
140
- For instance, a malicious participant can damage model weights by sending large numbers instead of the correct gradients.
141
- The same can happen due to broken hardware or misconfiguration.
142
- </p>
143
-
144
- <ul>
145
- <li>
146
- <p>
147
- One possible defense is using <b>authentication</b> combined with <b>model checkpointing</b>.
148
- In this case, participants should log in (e.g. with their Hugging Face account) to interact with the rest of the collaboration.
149
- In turn, moderators can screen potential participants and add them to an allowlist.
150
- If something goes wrong (e.g. if a participant sends invalid gradients and the model diverges),
151
- the moderators remove them from the list and revert the model to the latest checkpoint unaffected by the attack.
152
- </p>
153
-
154
- <!-- <p><b>Spoiler (TODO): How to implement authentication in a decentralized system efficiently?</b></p>-->
155
-
156
- <p>
157
- Nice bonus: using this data, the moderators can acknowledge the personal contribution of each participant.
158
- </p>
159
- </li>
160
- <li>
161
- <p>
162
- Another defense is replacing the naive averaging of the peers' gradients with an <b>aggregation technique robust to outliers</b>.
163
- <a target="_blank" rel="noopener noreferrer" href="https://arxiv.org/abs/2012.10333">Karimireddy et al. (2020)</a>
164
- suggested such a technique (named CenteredClip) and proved that it does not significantly affect the model's convergence.
165
- </p>
166
-
167
- <!-- <p><b>Spoiler (TODO): How does CenteredClip protect from outliers? (Interactive Demo)</b></p>-->
168
-
169
- <p>
170
- In our case, CenteredClip is useful but not enough to protect from malicious participants,
171
- since it implies that the CenteredClip procedure itself is performed by a trusted server.
172
- In contrast, in our decentralized system, all participants can aggregate a part of the gradients and we cannot assume all of them to be trusted.
173
- </p>
174
-
175
- <p>
176
- Recently, <a target="_blank" rel="noopener noreferrer" href="https://arxiv.org/abs/2106.11257">Gorbunov et al. (2021)</a>
177
- proposed a robust aggregation protocol for decentralized systems that does not require this assumption.
178
- This protocol uses CenteredClip as a subroutine but is able to detect and ban participants who performed it incorrectly.
179
- </p>
180
- </li>
181
- </ul>
182
- </div>
183
- <div role="tabpanel" class="tab-pane" id="tab3">
184
- <p>In this section, we provide a roadmap for you to run the collaborative training yourself.</p>
185
- <p>
186
- <b>Got confused?</b> Feel free to ask any questions at our <a target="_blank" rel="noopener noreferrer" href="https://discord.gg/uGugx9zYvN">Discord</a>!
187
- </p>
188
- <ol>
189
- <li>
190
- Set up dataset streaming:
191
- <ul>
192
- <li>
193
- <a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/docs/datasets/share_dataset.html">Upload</a> your dataset to Hugging Face Hub
194
- in a streaming-friendly format (<a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/datasets/laion/laion_100m_vqgan_f8">example</a>).
195
- </li>
196
- <li>Set up dataset streaming (see the "Efficient Training" section).</li>
197
- </ul>
198
- </li>
199
- <li>
200
- Write code of training peers (<a target="_blank" rel="noopener noreferrer" href="https://github.com/learning-at-home/dalle-hivemind/blob/main/run_trainer.py">example</a>):
201
- <ul>
202
- <li>Implement your model, set up dataset streaming, and write the training loop.</li>
203
- <li>
204
- Get familiar with the hivemind library
205
- (e.g., via the <a target="_blank" rel="noopener noreferrer" href="https://learning-at-home.readthedocs.io/en/latest/user/quickstart.html">quickstart</a>).
206
- </li>
207
- <li>
208
- In the training loop, wrap up your PyTorch optimizer with
209
- <a target="_blank" rel="noopener noreferrer" href="https://learning-at-home.readthedocs.io/en/latest/modules/optim.html#hivemind.optim.experimental.optimizer.Optimizer">hivemind.Optimizer</a>
210
- (<a target="_blank" rel="noopener noreferrer" href="https://github.com/learning-at-home/dalle-hivemind/blob/main/task.py#L121">example</a>).
211
- </li>
212
- </ul>
213
- </li>
214
- <li>
215
- <b>(optional)</b> Write code of auxiliary peers (<a target="_blank" rel="noopener noreferrer" href="https://github.com/learning-at-home/dalle-hivemind/blob/main/run_aux_peer.py">example</a>):
216
- <ul>
217
- <li>
218
- Auxiliary peers a special kind of peers responsible for
219
- logging loss and other metrics (e.g., to <a target="_blank" rel="noopener noreferrer" href="https://wandb.ai/">Weights & Biases</a>)
220
- and uploading model checkpoints (e.g., to <a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/docs/transformers/model_sharing">Hugging Face Hub</a>).
221
- </li>
222
- <li>
223
- Such peers don't need to calculate gradients and may be run on cheap machines without GPUs.
224
- </li>
225
- <li>
226
- They can serve as a convenient entry point to
227
- <a target="_blank" rel="noopener noreferrer" href="https://learning-at-home.readthedocs.io/en/latest/modules/dht.html">hivemind.DHT</a>
228
- (i.e., their address can be specified as <code>initial_peers</code>).
229
- </li>
230
- <li>
231
- It is useful to fix their address by providing <code>host_maddrs</code> and <code>identity_path</code>
232
- arguments to <code>hivemind.DHT</code>
233
- (these are forwarded to the underlying <a target="_blank" rel="noopener noreferrer" href="https://libp2p.io/">libp2p</a> daemon).
234
- </li>
235
- </ul>
236
- </li>
237
- <li>
238
- <b>(optional)</b> Make it easier for other people to join:
239
- <ul>
240
- <li>
241
- Create notebooks for free GPU providers (Google Colab, Kaggle, AWS SageMaker, etc.).
242
- People may run them online and/or download and run them on their own hardware.
243
- </li>
244
- <li>
245
- <a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/organizations/new">Create</a> a Hugging Face organization
246
- with all resources related to the training
247
- (dataset, model, inference demo, links to a dashboard with loss and other metrics, etc.).
248
- Look at <a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/training-transformers-together">ours</a> as an example.
249
- </li>
250
- <li>
251
- Set up an authentication system (see the "Security" section).
252
- For example, you can ask people to join your organization with their Hugging Face accounts
253
- (Hugging Face allows to share a link for joining or manually approve new participants).
254
- This allows you to screen participants,
255
- acknowledge their contributions (e.g., make a leaderboard), and
256
- ban accounts who behave maliciously.
257
- </li>
258
- <li>
259
- Set up an inference demo for your model (e.g., using <a target="_blank" rel="noopener noreferrer" href="https://huggingface.co/spaces">Spaces</a>) or
260
- a script that periodically uploads the inference results to show the training progress.
261
- </li>
262
- </ul>
263
- </li>
264
- </ol>
265
- </div>
266
- </div>
267
-
268
- </div>
269
- </div>
270
- </body>
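The "Memory-Efficient Training" and "Make Your Own" tabs in the file above describe three building blocks: dataset streaming, 8-bit optimizers, and wrapping a regular PyTorch optimizer with hivemind.Optimizer. The sketch below only illustrates how those pieces could fit together; it is not the run_trainer.py linked in the tabs, and the model, dataset id, run_id and all hyperparameters are placeholders.

import bitsandbytes as bnb
import hivemind
import torch
from datasets import load_dataset

# Placeholder model standing in for the real DALL-E transformer.
model = torch.nn.Linear(512, 512)

# Dataset streaming: fetch shards lazily instead of downloading hundreds of GB upfront.
dataset = load_dataset("laion/laion_100m_vqgan_f8", split="train", streaming=True)

# 8-bit Adam keeps optimizer statistics in roughly 2 bytes per parameter.
base_optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-4)

# Join (or start) the collaboration; a real participant would pass initial_peers=[...]
# pointing at an auxiliary peer, as described in the "Make Your Own" tab.
dht = hivemind.DHT(start=True)

# hivemind.Optimizer accumulates a large global batch across peers and averages updates.
optimizer = hivemind.Optimizer(
    dht=dht,
    run_id="demo-run",            # peers with the same run_id train together
    optimizer=base_optimizer,
    batch_size_per_step=4,        # samples this peer processes per optimizer.step()
    target_batch_size=16384,      # global batch accumulated across all peers
    use_local_updates=True,
    verbose=True,
)

for step in range(10):            # toy loop; a real peer would iterate over `dataset`
    loss = model(torch.randn(4, 512)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()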
streamlit_observable/__init__.py DELETED
@@ -1,71 +0,0 @@
- import os
-
- import streamlit.components.v1 as components
-
- _RELEASE = True
-
- if not _RELEASE:
- _component_func = components.declare_component(
- "observable",
- url="http://localhost:3001",
- )
- else:
- parent_dir = os.path.dirname(os.path.abspath(__file__))
- build_dir = os.path.join(parent_dir, "frontend", "build")
- _component_func = components.declare_component("observable", path=build_dir)
-
-
- def observable(key, notebook, targets=None, redefine={}, observe=[], hide=[]):
- """Create a new instance of "observable".
-
- Parameters
- ----------
- key: str
- A unique string used to avoid constant re-renders to the iframe.
- notebook: str
- The observablehq.com notebook id to embed. Ex. "@"d3/bar-chart"
- or "d/1f434ef3b0569a00"
- targets: list or None
- An optional list of strings that are the name of the cells to embed.
- By default, the entire notebook, including unnamed cells, will be embeded.
- observe: list or None
- An optional list of strings that are the name of cells to observe.
- Whenever these cells change value or become fulfilled, the value will
- be passed back into Streamlit as part of the return value.
- redefine: dict or None
- An optional dict containing the cells you wish to redefine and the values
- you wish to redefine them as. The keys are the cell names you want to
- redefine, the values are what they will be redefined as. Keep in mind,
- there is a serialization process from Streamlit Python -> frontend JavaScript.
- hide: list or None
- An option list of strings that are the names of cells that will be embeded,
- but won't be rendered to the DOM.
- Returns
- -------
- dict
- An object containing the live observed values. If the observe parameter is
- empty, then the dict will be empty. The keys are the name of the cell that
- is observe, the values are the values of the cells.
-
- """
- component_value = _component_func(
- notebook=notebook, targets=targets, observe=observe, redefine=redefine, hide=hide, key=key, name=key
- )
-
- if component_value is None:
- return {}
-
- return component_value
-
-
- # if not _RELEASE:
- # import streamlit as st
- # observers = observable("World Tour!",
- # notebook="@d3/world-tour",
- # targets=["canvas"],
- # observe=["name"]
- # )
-
- # name = observers.get("name")
-
- # st.write(f"Current country: ** *{name}* **")
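The docstring above spells out the component's parameters; for a concrete call, the deleted charts.py embedded the participants bubble chart roughly as follows. The notebook id and cell names are taken from that file and the data loading from the deleted dashboard_utils/bubbles.py; nothing here is new to this commit.

from dashboard_utils.bubbles import get_new_bubble_data
from streamlit_observable import observable

serialized_data, profiles = get_new_bubble_data()
observable(
    "Participants",                  # key: stable string to avoid needless iframe re-renders
    notebook="d/9ae236a507f54046",   # the participants-bubbles-chart notebook
    targets=["c_noaws"],             # render only this cell
    redefine={"serializedData": serialized_data, "profileSimple": profiles},
)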
streamlit_observable/frontend/build/asset-manifest.json DELETED
@@ -1,20 +0,0 @@
- {
- "files": {
- "main.js": "./static/js/main.5bdac2e3.chunk.js",
- "main.js.map": "./static/js/main.5bdac2e3.chunk.js.map",
- "runtime-main.js": "./static/js/runtime-main.11ec9aca.js",
- "runtime-main.js.map": "./static/js/runtime-main.11ec9aca.js.map",
- "static/js/2.b1c975ff.chunk.js": "./static/js/2.b1c975ff.chunk.js",
- "static/js/2.b1c975ff.chunk.js.map": "./static/js/2.b1c975ff.chunk.js.map",
- "index.html": "./index.html",
- "precache-manifest.8096ee623e3f349cc3813a91ceb00929.js": "./precache-manifest.8096ee623e3f349cc3813a91ceb00929.js",
- "service-worker.js": "./service-worker.js",
- "static/js/2.b1c975ff.chunk.js.LICENSE.txt": "./static/js/2.b1c975ff.chunk.js.LICENSE.txt",
- "static/js/main.5bdac2e3.chunk.js.LICENSE.txt": "./static/js/main.5bdac2e3.chunk.js.LICENSE.txt"
- },
- "entrypoints": [
- "static/js/runtime-main.11ec9aca.js",
- "static/js/2.b1c975ff.chunk.js",
- "static/js/main.5bdac2e3.chunk.js"
- ]
- }
 
 
streamlit_observable/frontend/build/index.html DELETED
@@ -1 +0,0 @@
- <!doctype html><html lang="en"><head><title>Streamlit Component</title><meta charset="UTF-8"/><meta name="viewport" content="width=device-width,initial-scale=1"/><meta name="theme-color" content="#000000"/><meta name="description" content="Streamlit Component"/><link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@observablehq/inspector@3/dist/inspector.css"/></head><body><noscript>You need to enable JavaScript to run this app.</noscript><div id="root"></div><script>!function(e){function t(t){for(var n,l,a=t[0],p=t[1],i=t[2],c=0,s=[];c<a.length;c++)l=a[c],Object.prototype.hasOwnProperty.call(o,l)&&o[l]&&s.push(o[l][0]),o[l]=0;for(n in p)Object.prototype.hasOwnProperty.call(p,n)&&(e[n]=p[n]);for(f&&f(t);s.length;)s.shift()();return u.push.apply(u,i||[]),r()}function r(){for(var e,t=0;t<u.length;t++){for(var r=u[t],n=!0,a=1;a<r.length;a++){var p=r[a];0!==o[p]&&(n=!1)}n&&(u.splice(t--,1),e=l(l.s=r[0]))}return e}var n={},o={1:0},u=[];function l(t){if(n[t])return n[t].exports;var r=n[t]={i:t,l:!1,exports:{}};return e[t].call(r.exports,r,r.exports,l),r.l=!0,r.exports}l.m=e,l.c=n,l.d=function(e,t,r){l.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:r})},l.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},l.t=function(e,t){if(1&t&&(e=l(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var r=Object.create(null);if(l.r(r),Object.defineProperty(r,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var n in e)l.d(r,n,function(t){return e[t]}.bind(null,n));return r},l.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return l.d(t,"a",t),t},l.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},l.p="./";var a=this.webpackJsonpstreamlit_component_template=this.webpackJsonpstreamlit_component_template||[],p=a.push.bind(a);a.push=t,a=a.slice();for(var i=0;i<a.length;i++)t(a[i]);var f=p;r()}([])</script><script src="./static/js/2.b1c975ff.chunk.js"></script><script src="./static/js/main.5bdac2e3.chunk.js"></script></body></html>
 
 
streamlit_observable/frontend/build/precache-manifest.8096ee623e3f349cc3813a91ceb00929.js DELETED
@@ -1,26 +0,0 @@
- self.__precacheManifest = (self.__precacheManifest || []).concat([
-   {
-     "revision": "2ec6acc026cff43b53185364bb91d16e",
-     "url": "./index.html"
-   },
-   {
-     "revision": "5a67f673dcdf30bf693d",
-     "url": "./static/js/2.b1c975ff.chunk.js"
-   },
-   {
-     "revision": "9b318b6fb13190fe82c0677e9264b3c7",
-     "url": "./static/js/2.b1c975ff.chunk.js.LICENSE.txt"
-   },
-   {
-     "revision": "36b8de6fe4fc2eeae54b",
-     "url": "./static/js/main.5bdac2e3.chunk.js"
-   },
-   {
-     "revision": "6515c66d2a8747a146d578e1c038a822",
-     "url": "./static/js/main.5bdac2e3.chunk.js.LICENSE.txt"
-   },
-   {
-     "revision": "7c26bca7e16783d14d15",
-     "url": "./static/js/runtime-main.11ec9aca.js"
-   }
- ]);
 
 
streamlit_observable/frontend/build/service-worker.js DELETED
@@ -1,39 +0,0 @@
- /**
-  * Welcome to your Workbox-powered service worker!
-  *
-  * You'll need to register this file in your web app and you should
-  * disable HTTP caching for this file too.
-  * See https://goo.gl/nhQhGp
-  *
-  * The rest of the code is auto-generated. Please don't update this file
-  * directly; instead, make changes to your Workbox build configuration
-  * and re-run your build process.
-  * See https://goo.gl/2aRDsh
-  */
-
- importScripts("https://storage.googleapis.com/workbox-cdn/releases/4.3.1/workbox-sw.js");
-
- importScripts(
-   "./precache-manifest.8096ee623e3f349cc3813a91ceb00929.js"
- );
-
- self.addEventListener('message', (event) => {
-   if (event.data && event.data.type === 'SKIP_WAITING') {
-     self.skipWaiting();
-   }
- });
-
- workbox.core.clientsClaim();
-
- /**
-  * The workboxSW.precacheAndRoute() method efficiently caches and responds to
-  * requests for URLs in the manifest.
-  * See https://goo.gl/S9QRab
-  */
- self.__precacheManifest = [].concat(self.__precacheManifest || []);
- workbox.precaching.precacheAndRoute(self.__precacheManifest, {});
-
- workbox.routing.registerNavigationRoute(workbox.precaching.getCacheKeyForURL("./index.html"), {
-
-   blacklist: [/^\/_/,/\/[^/?]+\.[^/]+$/],
- });
 
 
streamlit_observable/frontend/build/static/js/2.b1c975ff.chunk.js DELETED
The diff for this file is too large to render. See raw diff
 
streamlit_observable/frontend/build/static/js/2.b1c975ff.chunk.js.LICENSE.txt DELETED
@@ -1,41 +0,0 @@
- /*
- object-assign
- (c) Sindre Sorhus
- @license MIT
- */
-
- /** @license React v0.19.1
-  * scheduler.production.min.js
-  *
-  * Copyright (c) Facebook, Inc. and its affiliates.
-  *
-  * This source code is licensed under the MIT license found in the
-  * LICENSE file in the root directory of this source tree.
-  */
-
- /** @license React v16.13.1
-  * react-is.production.min.js
-  *
-  * Copyright (c) Facebook, Inc. and its affiliates.
-  *
-  * This source code is licensed under the MIT license found in the
-  * LICENSE file in the root directory of this source tree.
-  */
-
- /** @license React v16.14.0
-  * react-dom.production.min.js
-  *
-  * Copyright (c) Facebook, Inc. and its affiliates.
-  *
-  * This source code is licensed under the MIT license found in the
-  * LICENSE file in the root directory of this source tree.
-  */
-
- /** @license React v16.14.0
-  * react.production.min.js
-  *
-  * Copyright (c) Facebook, Inc. and its affiliates.
-  *
-  * This source code is licensed under the MIT license found in the
-  * LICENSE file in the root directory of this source tree.
-  */
 
 
streamlit_observable/frontend/build/static/js/2.b1c975ff.chunk.js.map DELETED
The diff for this file is too large to render. See raw diff
 
streamlit_observable/frontend/build/static/js/main.5bdac2e3.chunk.js DELETED
@@ -1,3 +0,0 @@
- /*! For license information please see main.5bdac2e3.chunk.js.LICENSE.txt */
- [minified webpack bundle omitted; see raw diff]
- //# sourceMappingURL=main.5bdac2e3.chunk.js.map
 
 
 
 
streamlit_observable/frontend/build/static/js/main.5bdac2e3.chunk.js.LICENSE.txt DELETED
@@ -1,16 +0,0 @@
- /**
-  * @license
-  * Copyright 2018-2020 Streamlit Inc.
-  *
-  * Licensed under the Apache License, Version 2.0 (the "License");
-  * you may not use this file except in compliance with the License.
-  * You may obtain a copy of the License at
-  *
-  *    http://www.apache.org/licenses/LICENSE-2.0
-  *
-  * Unless required by applicable law or agreed to in writing, software
-  * distributed under the License is distributed on an "AS IS" BASIS,
-  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  * See the License for the specific language governing permissions and
-  * limitations under the License.
-  */
 
 
streamlit_observable/frontend/build/static/js/main.5bdac2e3.chunk.js.map DELETED
The diff for this file is too large to render. See raw diff
 
 
streamlit_observable/frontend/build/static/js/runtime-main.11ec9aca.js DELETED
@@ -1,2 +0,0 @@
- !function(e){function t(t){for(var n,l,a=t[0],p=t[1],i=t[2],c=0,s=[];c<a.length;c++)l=a[c],Object.prototype.hasOwnProperty.call(o,l)&&o[l]&&s.push(o[l][0]),o[l]=0;for(n in p)Object.prototype.hasOwnProperty.call(p,n)&&(e[n]=p[n]);for(f&&f(t);s.length;)s.shift()();return u.push.apply(u,i||[]),r()}function r(){for(var e,t=0;t<u.length;t++){for(var r=u[t],n=!0,a=1;a<r.length;a++){var p=r[a];0!==o[p]&&(n=!1)}n&&(u.splice(t--,1),e=l(l.s=r[0]))}return e}var n={},o={1:0},u=[];function l(t){if(n[t])return n[t].exports;var r=n[t]={i:t,l:!1,exports:{}};return e[t].call(r.exports,r,r.exports,l),r.l=!0,r.exports}l.m=e,l.c=n,l.d=function(e,t,r){l.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:r})},l.r=function(e){"undefined"!==typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},l.t=function(e,t){if(1&t&&(e=l(e)),8&t)return e;if(4&t&&"object"===typeof e&&e&&e.__esModule)return e;var r=Object.create(null);if(l.r(r),Object.defineProperty(r,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var n in e)l.d(r,n,function(t){return e[t]}.bind(null,n));return r},l.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return l.d(t,"a",t),t},l.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},l.p="./";var a=this.webpackJsonpstreamlit_component_template=this.webpackJsonpstreamlit_component_template||[],p=a.push.bind(a);a.push=t,a=a.slice();for(var i=0;i<a.length;i++)t(a[i]);var f=p;r()}([]);
- //# sourceMappingURL=runtime-main.11ec9aca.js.map
 
 
 
streamlit_observable/frontend/build/static/js/runtime-main.11ec9aca.js.map DELETED
The diff for this file is too large to render. See raw diff