PeppoCola committed
Commit c8c7e15
1 Parent(s): 5fc72d5

Upload folder using huggingface_hub

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,251 @@
+ ---
+ library_name: setfit
+ metrics:
+ - accuracy
+ pipeline_tag: text-classification
+ tags:
+ - setfit
+ - sentence-transformers
+ - text-classification
+ - generated_from_setfit_trainer
+ widget:
+ - text: "Throttle send frame EVR\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**|commit\
+ \ d3fa31c |\r\n|**_Affected Component_**| ? |\r\n---\r\n## Problem Description\r\
+ \n\r\nA description of the problem with sufficient detail to understand the issue.\r\
+ \n\r\nIf there is no ground system, the interface continuously sends this pair\
+ \ of EVRs:\r\n\r\n```\r\n0x201db690 (TV_TLM): [ERROR] Failed to send framed data:\
+ \ 0\r\n0x201db690 (TV_TLM): [ERROR] Failed to send framed data: 0\r\n0x201db690\
+ \ (TV_TLM): [ERROR] Failed to send framed data: 0\r\n0x201db690 (TV_TLM): [ERROR]\
+ \ Failed to send framed data: 0\r\n0x201db690 (TV_TLM): [ERROR] Failed to send\
+ \ framed data: 0\r\n0x201db690 (TV_TLM): [ERROR] Failed to send framed data: 0\r\
+ \n0x202236f0 (TV_ReceiveTask): [WARNING] Failed to open port with status 61 and\
+ \ errno 0\r\n```\r\n\r\n## How to Reproduce\r\n\r\n1. Run Ref without the ground\
+ \ system\r\n2.\r\n3.\r\n\r\n## Expected Behavior\r\n\r\nIMHO the EVR should throttle,\
+ \ and perhaps the throttle is reset when the connection is make.\r\n"
+ - text: "Color-coding interlaced Events in the API Log\n| | |\r\n|:---|:---|\r\n|**_F´\
+ \ Version_**| |\r\n|**_Affected Component_**| |\r\n---\r\nOne feature that wasn't\
+ \ completed this summer was to color-code interlaced event logs based on severity.\
+ \ Presently, interlacing events are implemented by making the API a consumer of\
+ \ the event decoder in the GDS and then filtering events. Modifying the color\
+ \ of these log messages can be done [here](https://github.com/nasa/fprime/blob/717bc6fab85c53680108fc961cad6338e779816f/Gds/src/fprime_gds/common/testing_fw/api.py#L1258).\r\
+ \n"
+ - text: "Switch Framer and Deframer to use Mallocator Pattern\n| | |\r\n|:---|:---|\r\
+ \n|**_F´ Version_**| |\r\n|**_Affected Component_**| |\r\n---\r\n## Problem\
+ \ Description\r\n\r\nMallocator pattern is preferred over member-allocated buffers."
+ - text: "Ninja support for fprime-tools\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**|\
+ \ |\r\n|**_Affected Component_**| |\r\n---\r\n## Problem Description\r\n\r\n\
+ There are a couple places in fprime-tools where things would break if Ninja was\
+ \ used instead of Make. We need to fix that, as Ninja is usually much faster.\r\
+ \ne.g. [this](https://github.com/fprime-community/fprime-tools/blob/0a9fdf58ce4b428d407ab264f7266041808237c8/src/fprime/fbuild/cmake.py#L133)\
+ \ is Make-specific output, Ninja formats it differently\r\n\r\n## Expected Behavior\r\
+ \n\r\nSupport Ninja with fprime-tools. Add a convenient option to chose which\
+ \ one to use.\r\n"
+ - text: "Build A Frame Reassembler\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**| |\r\
+ \n|**_Affected Component_**| |\r\n---\r\n## Feature Description\r\n\r\nBuild\
+ \ a component that can be used to reassemble communication frames given protocol\
+ \ information. This will break-off this functionality from the Deframer.\r\n\r\
+ \nBasic requirements:\r\n1. Accept incoming Fw::Buffers of any size\r\n2. Accumulate\
+ \ buffers in-order\r\n3. Call frame detector helper class\r\n4. On \"NO_FRAME\"\
+ \ discard first byte and try again\r\n5. On \"NEED DATA\" continue to accumulate\
+ \ data\r\n6. On \"FRAME\" allocate buffer, copy-out frame\r\n\r\nHelper class\
+ \ requirements:\r\n1. Must implement `Enum detect_frame(const CircularBuffer&\
+ \ buffer, FwSizeType& size_output)` method\r\n2. Cannot alter circular buffer\
+ \ (uses peeks)\r\n3. Must set `size_output` when data is needed and when frame\
+ \ detected\r\n "
+ inference: true
+ ---
+
+ # SetFit
+
+ This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
+
+ The model has been trained using an efficient few-shot learning technique that involves:
+
+ 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
+ 2. Training a classification head with features from the fine-tuned Sentence Transformer.
+
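+ The two steps above correspond to SetFit's `Trainer` API. Below is a minimal training sketch, assuming a SetFit 1.x installation; the tiny few-shot dataset and its example issue texts are hypothetical placeholders, while the base Sentence Transformer and the label names come from this repository's configuration files.
+
+ ```python
+ from datasets import Dataset
+ from setfit import SetFitModel, Trainer, TrainingArguments
+
+ # Hypothetical few-shot training data with "text" and "label" columns.
+ train_ds = Dataset.from_dict({
+     "text": [
+         "Segfault when the command table overflows",
+         "Deframer drops bytes after a partial frame",
+         "Add Ninja generator support to the build helper",
+         "Document the new project setup options",
+     ],
+     "label": ["bug", "bug", "non-bug", "non-bug"],
+ })
+
+ # The default (non-differentiable) head is a scikit-learn LogisticRegression,
+ # matching the classification head described in this card.
+ model = SetFitModel.from_pretrained(
+     "sentence-transformers/all-mpnet-base-v2",
+     labels=["bug", "non-bug"],
+ )
+
+ trainer = Trainer(
+     model=model,
+     args=TrainingArguments(batch_size=16, num_epochs=1),
+     train_dataset=train_ds,
+ )
+ trainer.train()  # step 1: contrastive fine-tuning; step 2: fit the classification head
+ ```
+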
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** SetFit
+ <!-- - **Sentence Transformer:** [Unknown](https://huggingface.co/unknown) -->
+ - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
+ - **Maximum Sequence Length:** 384 tokens
+ - **Number of Classes:** 2 classes
+ <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
+ - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
+ - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
+
+ ### Model Labels
+ | Label | Examples |
+ |:--------|:---------|
+ | bug | <ul><li>"A global-buffer-overflow error in BufferQueueTest.cpp line 126\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**|1.5 |\r\n|**_Affected Component_**| Os/Pthreads |\r\n---\r\n## Problem Description\r\n\r\nA global-buffer-overflow error in [BufferQueueTest.cpp line 126](https://github.com/nasa/fprime/blob/ada6e424b813e6434284bbde9f785656a61c8539/Os/Pthreads/test/ut/BufferQueueTest.cpp#L126)\r\n\r\n```\r\nzyh@virtual:~/fprime$ ./build-fprime-automatic-native-ut/bin/Linux/Os_pthreads\r\nCreating queue.\r\nTest empty queue...\r\nPassed.\r\nTest full queue...\r\nPushing 0.\r\nPushing 1.\r\nPushing 2.\r\nPushing 3.\r\nPushing 4.\r\nPassed.\r\nTest weird size...\r\nPassed.\r\nTest pop...\r\nPopping 5.\r\nPopping 4.\r\nPopping 3.\r\nPopping 2.\r\nPopping 1.\r\nPassed.\r\nTest priorities...\r\nPushing 'hello' at priority 9.\r\nPushing 'how are you' at priority 4.\r\nPushing 'pretty good' at priority 100.\r\nPushing 'cosmic bro' at priority 4.\r\nPushing 'kthxbye' at priority 9.\r\nPopped 'pretty good' at priority 100. Expected 'pretty good' at priority 100.\r\nPopped 'hello' at priority 9. Expected 'hello' at priority 9.\r\n=================================================================\r\n==17078==ERROR: AddressSanitizer: global-buffer-overflow on address 0x000000550bc6 at pc 0x0000004b3f6b bp 0x7ffd804c4c90 sp 0x7ffd804c4440\r\nREAD of size 8 at 0x000000550bc6 thread T0\r\n #0 0x4b3f6a in __interceptor_memcmp.part.283 (/home/zyh/fprime/build-fprime-automatic-native-ut/bin/Linux/Os_pthreads+0x4b3f6a)\r\n #1 0x51d718 in main /home/zyh/fprime/Os/Pthreads/test/ut/BufferQueueTest.cpp:126:5\r\n #2 0x7f976cdadbf6 in __libc_start_main /build/glibc-S9d2JN/glibc-2.27/csu/../csu/libc-start.c:310\r\n #3 0x41ab89 in _start (/home/zyh/fprime/build-fprime-automatic-native-ut/bin/Linux/Os_pthreads+0x41ab89)\r\n\r\n0x000000550bc6 is located 58 bytes to the left of global variable '<string literal>' defined in '/home/zyh/fprime/Os/Pthreads/test/ut/BufferQueueTest.cpp:98:43' (0x550c00) of size 12\r\n '<string literal>' is ascii string 'how are you'\r\n0x000000550bc6 is located 0 bytes to the right of global variable '<string literal>' defined in '/home/zyh/fprime/Os/Pthreads/test/ut/BufferQueueTest.cpp:98:34' (0x550bc0) of size 6\r\n '<string literal>' is ascii string 'hello'\r\nSUMMARY: AddressSanitizer: global-buffer-overflow (/home/zyh/fprime/build-fprime-automatic-native-ut/bin/Linux/Os_pthreads+0x4b3f6a) in __interceptor_memcmp.part.283\r\nShadow bytes around the buggy address:\r\n 0x0000800a2120: f9 f9 f9 f9 00 00 00 00 00 00 00 01 f9 f9 f9 f9\r\n 0x0000800a2130: 00 00 05 f9 f9 f9 f9 f9 00 01 f9 f9 f9 f9 f9 f9\r\n 0x0000800a2140: 00 00 04 f9 f9 f9 f9 f9 00 05 f9 f9 f9 f9 f9 f9\r\n 0x0000800a2150: 00 00 04 f9 f9 f9 f9 f9 00 05 f9 f9 f9 f9 f9 f9\r\n 0x0000800a2160: 00 05 f9 f9 f9 f9 f9 f9 00 00 04 f9 f9 f9 f9 f9\r\n=>0x0000800a2170: 00 00 04 f9 f9 f9 f9 f9[06]f9 f9 f9 f9 f9 f9 f9\r\n 0x0000800a2180: 00 04 f9 f9 f9 f9 f9 f9 00 04 f9 f9 f9 f9 f9 f9\r\n 0x0000800a2190: 00 03 f9 f9 f9 f9 f9 f9 00 f9 f9 f9 f9 f9 f9 f9\r\n 0x0000800a21a0: 00 00 00 06 f9 f9 f9 f9 00 00 04 f9 f9 f9 f9 f9\r\n 0x0000800a21b0: 00 00 00 00 00 00 00 03 f9 f9 f9 f9 00 04 f9 f9\r\n 0x0000800a21c0: f9 f9 f9 f9 00 00 00 00 00 00 00 00 00 00 00 00\r\nShadow byte legend (one shadow byte represents 8 application bytes):\r\n Addressable: 00\r\n Partially addressable: 01 02 03 04 05 06 07 \r\n Heap left redzone: fa\r\n Freed heap region: fd\r\n Stack left redzone: f1\r\n Stack mid redzone: f2\r\n Stack right redzone: f3\r\n Stack after 
return: f5\r\n Stack use after scope: f8\r\n Global redzone: f9\r\n Global init order: f6\r\n Poisoned by user: f7\r\n Container overflow: fc\r\n Array cookie: ac\r\n Intra object redzone: bb\r\n ASan internal: fe\r\n Left alloca redzone: ca\r\n Right alloca redzone: cb\r\n==17078==ABORTING\r\n```\r\n\r\n## How to Reproduce\r\n\r\n1. Compile unit test with ASAN\r\n2. Run build-fprime-automatic-native-ut/bin/Linux/Os_pthreads\r\n3. Program abort\r\n\r\n## Expected Behavior\r\n\r\nRun normally with ASAN.\r\n"</li><li>'toString doesn\'t display the last item in a struct\n| | |\r\n|:---|:---|\r\n|**F` version v3.1.1**| |\r\n|****| |\r\n---\r\n## Problem Description\r\n\r\nFiles generated by fpp tools doesn\'t produce the format string correctly and so the last member of a string doesn\'t show up as expected. For example \r\n\r\nThe take the following struct defined in fpp:\r\n```fpp\r\n enum CMD_STATUS : U16 {\r\n NA = 0,\r\n ACK = 0x5,\r\n NOT_ACK = 0x6,\r\n BUSY = 0x7,\r\n NCE = 0x8,\r\n STACK_FULL = 0x9,\r\n TEMP_NOT_ACC = 0x10\r\n };\r\n\r\n constant ESUP_HEADER_ID = 0x50555345\r\n\r\n# NOTE that in byte form it will be represented with LE\r\n struct EsupPacketHeader {\r\n HeaderId: U32\r\n ModuleId: U16\r\n DataLength: U16\r\n CmdStatus: CMD_STATUS\r\n } default {HeaderId = ESUP_HEADER_ID, CmdStatus=CMD_STATUS.NA}\r\n\r\n\r\n struct EsupStatusGet {\r\n Header: EsupPacketHeader\r\n CmdId: EsupCmdId\r\n TypeId: U16\r\n SystemState: U8 @< 1,2,3,4 (1 byte unsigned char value)\r\n StatusFlags: U8 @< 1 byte unsigned char value\r\n Reserved: U16 @< 2 byte unsigned short value\r\n CpuTemperature: F32 @< -40 to 125 °C (4 byte float value)\r\n FirmwareVersion: U32 @< firmware version (4 byte unsigned int value)\r\n } default {CmdId = EsupCmdId.ConfGet, TypeId = EsupCmdType.StatusRep}\r\n\r\n```\r\n\r\nI then go to print out the contents like so:\r\n\r\n```cpp\r\nstatic bool receiveEsupStatusResponse(int serialPort, EsupCmdId cmdId, boost::span<BYTE, GET_PADDED_SIZE(EsupStatusGet::SERIALIZED_SIZE)> txBuff) {\r\n // Wait for a reply\r\n EsupStatusGet cmdResult;\r\n\r\n size_t bytesRead = read(serialPort, txBuff.data(), txBuff.size());\r\n if (bytesRead > 0) {\r\n FW_CHECK(bytesRead <= txBuff.size(), "Error deserializing response", return false; );\r\n LE_ExternalDeSerializeBuffer responseBuff(txBuff.data(), bytesRead);\r\n cmdResult.deserialize(responseBuff);\r\n Fw::String ackString;\r\n cmdResult.toString(ackString);\r\n std::cout << std::string(ackString.toChar()) << std::endl;\r\n } else {\r\n std::cout << "No reply received." 
<< std::endl;\r\n }\r\n\r\n return true;\r\n }\r\n```\r\n\r\nI get the following:\r\n\r\n```bash\r\n(Header = (HeaderId = 1347769157, ModuleId = 8203, DataLength = 0, CmdStatus = ), CmdId = ConfGet, TypeId = 0, SystemState = 162, StatusFlags = 187, Reserved = 53517, CpuTemperature = 0, FirmwareVersion = )\r\n```\r\n\r\nWhere I would normally expect to see something like this (the actual values don\'t matter for these purposes just that there is something to print):\r\n\r\n```bash\r\n(Header = (HeaderId = 1347769157, ModuleId = 8203, DataLength = 0, CmdStatus = ACK ), CmdId = ConfGet, TypeId = 0, SystemState = 162, StatusFlags = 187, Reserved = 53517, CpuTemperature = 0, FirmwareVersion = 102444)\r\n```\r\n\r\nI\'ve found this is consistent across pretty much all the fpp serializable objects and it really comes down to the last member of the struct doesn\'t have a format string generated along with it\r\n\r\nSo for example we see with the header type the following gets generated:\r\n\r\n```c++\r\nvoid EsupPacketHeader::toString(Fw::StringBase& text) const {\r\n\r\n static const char * formatString =\r\n "("\r\n "HeaderId = %u, "\r\n "ModuleId = %u, "\r\n "DataLength = %u, "\r\n "CmdStatus = "\r\n ")";\r\n\r\n // declare strings to hold any serializable toString() arguments\r\n\r\n\r\n Fw::String CmdStatusStr;\r\n this->m_CmdStatus.toString(CmdStatusStr);\r\n\r\n char outputString[FW_SERIALIZABLE_TO_STRING_BUFFER_SIZE];\r\n (void)snprintf(outputString,FW_SERIALIZABLE_TO_STRING_BUFFER_SIZE,formatString\r\n ,this->m_HeaderId\r\n ,this->m_ModuleId\r\n ,this->m_DataLength\r\n ,CmdStatusStr.toChar()\r\n );\r\n outputString[FW_SERIALIZABLE_TO_STRING_BUFFER_SIZE-1] = 0; // NULL terminate\r\n\r\n text = outputString;\r\n}\r\n```\r\n\r\nWith the CmdStatus missing the "```%s```".\r\n\r\nI\'m not really sure what the source of the error is since the xml seems to be fine:\r\n\r\n```xml\r\n<serializable namespace="FlightComputer" name="EsupPacketHeader">\r\n <import_enum_type>FlightComputer/TransmitterInterface/CMD_STATUSEnumAi.xml</import_enum_type>\r\n <members>\r\n <member name="HeaderId" type="U32" format="%u">\r\n <default>1347769157</default>\r\n </member>\r\n <member name="ModuleId" type="U16" format="%u">\r\n <default>0</default>\r\n </member>\r\n <member name="DataLength" type="U16" format="%u">\r\n <default>0</default>\r\n </member>\r\n <member name="CmdStatus" type="FlightComputer::CMD_STATUS" format="%s">\r\n <default>FlightComputer::CMD_STATUS::NA</default>\r\n </member>\r\n </members>\r\n</serializable>\r\n```\r\nI took a look in array_cpp.py and array_cpp.tmpl but couldn\'t make sense of the issue there.\r\n\r\nIf I can get some help on this that\'d be much appreciated.'</li><li>'Single quote in project parent folder causes installation errors\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**| v3.4.3 |\r\n|**_Affected Component_**| Installation/Project Setup |\r\n---\r\n## Problem Description\r\n\r\nIf you name the parent directory of your project such that it has a single quote in it, you\'ll run into an error when you try to run \'fprime-util generate\' during basic project setup. \r\n\r\n## Context / Environment\r\n\r\n```\r\nOperating System: Linux\r\nCPU Architecture: x86_64\r\nPlatform: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35\r\nPython version: 3.10.12\r\nCMake version: 3.22.1\r\nPip version: 24.0\r\nPip packages:\r\n fprime-tools==3.4.4\r\n fprime-gds==3.4.3\r\n fprime-fpp-*==2.1.0a3\r\n```\r\n\r\n## How to Reproduce\r\n\r\n1. 
Create a parent directory containing a single quote (For instance, "F\'")\r\n2. Follow the basic project setup guide located here: https://fprime-community.github.io/fprime-tutorial-hello-world/docs/NewProject.html \r\n3. Once you hit \'fprime-util generate\', you should soon see an error about an unterminated quote from one of the fprime dependencies. \r\n\r\n## Expected Behavior\r\n\r\nIdeally, you\'d have a project set up properly in your virtual environment.\r\n'</li></ul> |
+ | non-bug | <ul><li>"Install Upgraded PIP in new Project\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**| |\r\n|**_Affected Component_**| |\r\n---\r\n## Feature Description\r\n\r\nOlder versions of PIP may pull in non-native tools package. We should upgrade PIP during the new project setup. Note: users who don't want this can still decline venv setup entirely."</li><li>'fprime-gds: default file downlink directory can overwrite files\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**|[v3.4.3](https://github.com/fprime-community/fprime-gds/releases/tag/v3.4.3) |\r\n|**_Affected Component_**| n/a |\r\n---\r\n## Feature Description\r\n\r\nHave GDS created a time-tag directory for file downlinks.\r\n\r\nMake the default be `logs/<time tag>/fprime-downlink` to match telemetry and events.\r\n\r\nAlso, make a separate argument for uplink and downlink. It makes sense to automatically separate downlinked files, but uplink may wish to be in a common store.\r\n\r\n## Rationale\r\n\r\nThe default `/tmp/username` (or even overridden with `--file-storage-directory`) directory for file downlinks can cause new instances to overwrite older files, or cause parallel instances to clobber each other.\r\n'</li><li>'Update FileDownlink to report file progress\n| | |\r\n|:---|:---|\r\n|**_F´ Version_**|v3.4.3 |\r\n|**_Affected Component_**| `Svc/FileDownlink` |\r\n---\r\n## Feature Description\r\n\r\n`Svc/FileDownlink` currently only reports completion status back to the sender only when the file is complete. This will make re-transmitting data products inefficient since the Data Catalog has no idea how far partial transmits made it before the link dropped or the system was powered down. \r\n\r\nThis feature would report partial progress back to DP Catalog can restart transmits.\r\n\r\n## Rationale\r\n\r\nMake data product downlink management more efficient.\r\n'</li></ul> |
+
+ ## Uses
+
+ ### Direct Use for Inference
+
+ First install the SetFit library:
+
+ ```bash
+ pip install setfit
+ ```
+
+ Then you can load this model and run inference.
+
+ ```python
+ from setfit import SetFitModel
+
+ # Download from the 🤗 Hub
+ model = SetFitModel.from_pretrained("setfit_model_id")
+ # Run inference
+ preds = model("Switch Framer and Deframer to use Mallocator Pattern
+ | | |
+ |:---|:---|
+ |**_F´ Version_**| |
+ |**_Affected Component_**| |
+ ---
+ ## Problem Description
+
+ Mallocator pattern is preferred over member-allocated buffers.")
+ ```
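+
+ `SetFitModel` also accepts batches. Continuing from the snippet above, a small sketch; the issue titles below are made-up examples, and `setfit_model_id` remains the same placeholder:
+
+ ```python
+ texts = [
+     "Deframer crashes on a zero-length frame",   # made-up example
+     "Document the new build cache options",      # made-up example
+ ]
+ labels = model.predict(texts)        # label strings, e.g. ['bug', 'non-bug']
+ probas = model.predict_proba(texts)  # per-class scores from the LogisticRegression head
+ ```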
+
+ <!--
+ ### Downstream Use
+
+ *List how someone could finetune this model on their own dataset.*
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Set Metrics
+ | Training set | Min | Median | Max |
+ |:-------------|:----|:---------|:-----|
+ | Word count | 4 | 124.1383 | 2486 |
+
+ | Label | Training Sample Count |
+ |:--------|:----------------------|
+ | bug | 296 |
+ | non-bug | 304 |
+
+ ### Training Hyperparameters
+ - batch_size: (16, 2)
+ - num_epochs: (1, 1)
+ - max_steps: -1
+ - sampling_strategy: oversampling
+ - num_iterations: 20
+ - body_learning_rate: (2e-05, 1e-05)
+ - head_learning_rate: 0.01
+ - loss: CosineSimilarityLoss
+ - distance_metric: cosine_distance
+ - margin: 0.25
+ - end_to_end: False
+ - use_amp: False
+ - warmup_proportion: 0.1
+ - l2_weight: 0.01
+ - seed: 42
+ - eval_max_steps: -1
+ - load_best_model_at_end: False
+
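+ For reference, these values correspond to SetFit's `TrainingArguments`. A sketch of how the most commonly adjusted entries would be passed, assuming the SetFit 1.1 API; tuples give the values for the embedding fine-tuning phase and the classifier phase, and `CosineSimilarityLoss` comes from sentence-transformers:
+
+ ```python
+ from sentence_transformers.losses import CosineSimilarityLoss
+ from setfit import TrainingArguments
+
+ # Tuple values are (embedding fine-tuning phase, classification head phase).
+ args = TrainingArguments(
+     batch_size=(16, 2),
+     num_epochs=(1, 1),
+     sampling_strategy="oversampling",
+     num_iterations=20,
+     body_learning_rate=(2e-05, 1e-05),
+     head_learning_rate=0.01,
+     loss=CosineSimilarityLoss,
+     warmup_proportion=0.1,
+     l2_weight=0.01,
+     seed=42,
+ )
+ ```
+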
+ ### Training Results
+ | Epoch | Step | Training Loss | Validation Loss |
+ |:------:|:----:|:-------------:|:---------------:|
+ | 0.0007 | 1 | 0.447 | - |
+ | 0.0333 | 50 | 0.2333 | - |
+ | 0.0667 | 100 | 0.083 | - |
+ | 0.1 | 150 | 0.039 | - |
+ | 0.1333 | 200 | 0.0354 | - |
+ | 0.1667 | 250 | 0.0177 | - |
+ | 0.2 | 300 | 0.0053 | - |
+ | 0.2333 | 350 | 0.0004 | - |
+ | 0.2667 | 400 | 0.0027 | - |
+ | 0.3 | 450 | 0.0015 | - |
+ | 0.3333 | 500 | 0.002 | - |
+ | 0.3667 | 550 | 0.0003 | - |
+ | 0.4 | 600 | 0.0001 | - |
+ | 0.4333 | 650 | 0.0001 | - |
+ | 0.4667 | 700 | 0.0001 | - |
+ | 0.5 | 750 | 0.0001 | - |
+ | 0.5333 | 800 | 0.0001 | - |
+ | 0.5667 | 850 | 0.0001 | - |
+ | 0.6 | 900 | 0.0001 | - |
+ | 0.6333 | 950 | 0.0001 | - |
+ | 0.6667 | 1000 | 0.0001 | - |
+ | 0.7 | 1050 | 0.0 | - |
+ | 0.7333 | 1100 | 0.0 | - |
+ | 0.7667 | 1150 | 0.0001 | - |
+ | 0.8 | 1200 | 0.0 | - |
+ | 0.8333 | 1250 | 0.0001 | - |
+ | 0.8667 | 1300 | 0.0 | - |
+ | 0.9 | 1350 | 0.0 | - |
+ | 0.9333 | 1400 | 0.0001 | - |
+ | 0.9667 | 1450 | 0.0 | - |
+ | 1.0 | 1500 | 0.0 | - |
+
+ ### Framework Versions
+ - Python: 3.11.6
+ - SetFit: 1.1.0
+ - Sentence Transformers: 3.0.1
+ - Transformers: 4.44.2
+ - PyTorch: 2.4.1+cu121
+ - Datasets: 2.21.0
+ - Tokenizers: 0.19.1
+
+ ## Citation
+
+ ### BibTeX
+ ```bibtex
+ @article{https://doi.org/10.48550/arxiv.2209.11055,
+   doi = {10.48550/ARXIV.2209.11055},
+   url = {https://arxiv.org/abs/2209.11055},
+   author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
+   keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
+   title = {Efficient Few-Shot Learning Without Prompts},
+   publisher = {arXiv},
+   year = {2022},
+   copyright = {Creative Commons Attribution 4.0 International}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "_name_or_path": "sentence-transformers/all-mpnet-base-v2",
+   "architectures": [
+     "MPNetModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "eos_token_id": 2,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "mpnet",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "relative_attention_num_buckets": 32,
+   "torch_dtype": "float32",
+   "transformers_version": "4.44.2",
+   "vocab_size": 30527
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "3.0.1",
+     "transformers": "4.44.2",
+     "pytorch": "2.4.1+cu121"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": null
+ }
config_setfit.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "normalize_embeddings": false,
+   "labels": [
+     "bug",
+     "non-bug"
+   ]
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:861c28586b92f009017f1caf26f4a8b389a98ccff10ce2d1cd28d508ec3946d4
+ size 437967672
model_head.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e9b8857ebcd1875a7797af729249a61e1a005635ca534342e3886ad6ad27f319
+ size 7055
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   },
+   {
+     "idx": 2,
+     "name": "2",
+     "path": "2_Normalize",
+     "type": "sentence_transformers.models.Normalize"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 384,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "cls_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<pad>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,72 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<pad>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "104": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "30526": {
+       "content": "<mask>",
+       "lstrip": true,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "do_lower_case": true,
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "max_length": 128,
+   "model_max_length": 384,
+   "pad_to_multiple_of": null,
+   "pad_token": "<pad>",
+   "pad_token_type_id": 0,
+   "padding_side": "right",
+   "sep_token": "</s>",
+   "stride": 0,
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "MPNetTokenizer",
+   "truncation_side": "right",
+   "truncation_strategy": "longest_first",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff