Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed because of a cast error
Error code:   DatasetGenerationCastError
Exception:    DatasetGenerationCastError
Message:      An error occurred while generating the dataset

All the data files must have the same columns, but at some point there are 6 new columns ({'Message', 'Git Diff', 'Hash', 'Author', 'IsMerge', 'Date'}) and 8 missing columns ({'is_merge', 'commit_message', 'masked_commit_message', 'author', 'type', 'date', 'git_diff', 'hash'}).

This happened while the csv dataset builder was generating data using

hf://datasets/rsh-raj/union-commits/union/train.csv (at revision 5c28cc1181c848daa9a31862554bf89441582c6d)

Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1871, in _prepare_split_single
                  writer.write_table(table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py", line 623, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2293, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2241, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              Hash: string
              Date: string
              Author: string
              Message: string
              IsMerge: bool
              Git Diff: string
              -- schema metadata --
              pandas: '{"index_columns": [{"kind": "range", "name": null, "start": 0, "' + 929
              to
              {'hash': Value(dtype='string', id=None), 'date': Value(dtype='string', id=None), 'author': Value(dtype='string', id=None), 'commit_message': Value(dtype='string', id=None), 'is_merge': Value(dtype='bool', id=None), 'git_diff': Value(dtype='string', id=None), 'type': Value(dtype='string', id=None), 'masked_commit_message': Value(dtype='string', id=None)}
              because column names don't match
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1438, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1050, in convert_to_parquet
                  builder.download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 925, in download_and_prepare
                  self._download_and_prepare(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1001, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1742, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1873, in _prepare_split_single
                  raise DatasetGenerationCastError.from_cast_error(
              datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
              
              All the data files must have the same columns, but at some point there are 6 new columns ({'Message', 'Git Diff', 'Hash', 'Author', 'IsMerge', 'Date'}) and 8 missing columns ({'is_merge', 'commit_message', 'masked_commit_message', 'author', 'type', 'date', 'git_diff', 'hash'}).
              
              This happened while the csv dataset builder was generating data using
              
              hf://datasets/rsh-raj/union-commits/union/train.csv (at revision 5c28cc1181c848daa9a31862554bf89441582c6d)
              
              Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
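One way to act on the first suggestion is to rewrite the CamelCase file so its header matches the snake_case schema used by the other data files. The sketch below uses pandas; the local path and the column mapping are assumptions read off the two column lists in the error, and since the CamelCase file has no `type` or `masked_commit_message` columns at all, it derives them from the conventional-commit prefix that the preview rows appear to follow.

```python
# Minimal sketch, not the dataset author's actual pipeline.
# Assumptions: the file is available locally as "train.csv", and the CamelCase
# columns map one-to-one onto the snake_case counterparts listed in the
# cast error above.
import re
import pandas as pd

df = pd.read_csv("train.csv")

# Rename the six mismatched columns.
df = df.rename(columns={
    "Hash": "hash",
    "Date": "date",
    "Author": "author",
    "Message": "commit_message",
    "IsMerge": "is_merge",
    "Git Diff": "git_diff",
})

# The CamelCase file has no 'type' or 'masked_commit_message' columns.
# Judging from the preview rows, they look like the conventional-commit
# prefix and the message with that prefix stripped, so derive them here.
PREFIX = re.compile(r"^(\w+)(?:\([^)]*\))?!?:\s*(.*)", re.DOTALL)

def split_message(message):
    match = PREFIX.match(message) if isinstance(message, str) else None
    return (match.group(1), match.group(2)) if match else ("", message)

df["type"], df["masked_commit_message"] = zip(*df["commit_message"].map(split_message))

df.to_csv("train.csv", index=False)
```

The other route the message offers, declaring the two layouts as separate configurations in the dataset README (see the manual-configuration docs linked above), avoids rewriting the data but leaves consumers with two schemas to handle.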


Columns and types (see the Features sketch after the preview):
hash: string
date: string
author: string
commit_message: string
is_merge: bool
git_diff: string
type: string
masked_commit_message: string
557c5bd108f8e9a2710f4053b8751034f01d0437
2025-03-07 06:20:08
PoisonPhang
chore: update v1.0.0 input
false
diff --git a/flake.lock b/flake.lock index 13ca43847e..18895a3a6c 100644 --- a/flake.lock +++ b/flake.lock @@ -760,11 +760,11 @@ "v1_0_0": { "flake": false, "locked": { - "lastModified": 1741296338, - "narHash": "sha256-awb25PsEcAaEOYBL5gQM48uda4Ybj12L6I2+cFb1uq4=", + "lastModified": 1741308140, + "narHash": "sha256-VT5/YMFCMjqew4urKoxm1sswc1KBsQQhF4ABoI4itVo=", "owner": "unionlabs", "repo": "union", - "rev": "46dd3042c98ed7ce773420c561c04e71422d6508", + "rev": "9a66a5ef4ee2498fcd8de7d66b6a31b4d947ba22", "type": "github" }, "original": {
chore
update v1.0.0 input
c2ba9f880ae2481fdb0bc1c4ca63d72303bbfe0f
2024-04-18 14:20:15
cor
feat(site): add polygon blogpost
false
diff --git a/site/src/content/blog/polygon-integration.mdx b/site/src/content/blog/polygon-integration.mdx new file mode 100644 index 0000000000..7ca395f863 --- /dev/null +++ b/site/src/content/blog/polygon-integration.mdx @@ -0,0 +1,39 @@ +--- +title: "Connecting Polygon to IBC by Unifying the AggLayer" +date: 2024-04-18 +author: "union_build" +description: "Union, the Modular Interoperability Layer, is plugging into the AggLayer, developed by leading L2 developer Polygon Labs, to facilitate message passing and asset transfers between the Polygon ecosystem and IBC-enabled chains. This integration will position Union as the de-facto IBC gateway in and out of the AggLayer, unlocking liquidity flow between two of the largest blockchain ecosystems: Polygon and Cosmos." +hidden: false +cover: "/src/assets/images/noble-and-movement-partnership/cover.png" +--- + +Union, the Modular Interoperability Layer, is plugging into the AggLayer, developed by leading L2 developer Polygon Labs, to facilitate message passing and asset transfers between the Polygon ecosystem and IBC-enabled chains. This integration will position Union as the de-facto IBC gateway in and out of the AggLayer, unlocking liquidity flow between two of the largest blockchain ecosystems: Polygon and Cosmos. + +The AggLayer was designed to solve the scalability problem in blockchain, which refers to scaling access to shared state and liquidity across multiple chains. To address this requires a new approach to blockchain architecture, namely, aggregated blockchains. Polygon Labs researchers and engineers have designed a solution – the aggregation layer, or AggLayer – which will seamlessly connect any ZK-enabled L2 or L1 chain, regardless of execution environment, for near-instant cross-chain transactions, and shared state and liquidity across chains. + +This includes chains leveraging Polygon Chain Development Kit (CDK), a modular, open source software toolkit for blockchain developers which supports the installation and configuration of a variety of chain architectures. Polygon CDK empowers developers to launch new L2 chains running Polygon zkEVM technology on Ethereum or, in the future, transition existing Layer 1 (L1) chains into zkEVM L2s. + + +The engineering team at Union and the research team at Polygon Labs will integrate Union's ZK proofs for IBC into the AggLayer. This integration aims to facilitate the seamless bridging of funds from any IBC-enabled chain to any blockchain connected to the AggLayer, ensuring a trustless, efficient, and low-latency process. + + +IMAGE HERE + +Immediate access to the widely-utilized IBC bridging protocol will provide numerous advantages for blockchains connected to the AggLayer, including: + +- Seamless access to liquidity without vendor lock-in +- Freedom for technical innovation through open standards +- Unrestricted liquidity movement and composability with other Web3 protocols +- Singular connection for interoperability with a basically unlimited number of chains + + +Union's two-way interoperability solution is also set to promote the smooth flow of liquidity and reduce user fragmentation between two modular ecosystems. Union’s collaborations within the Celestia modular stack will connect the AggLayer with the Celestia ecosystem, enabling the free flow of native asset liquidity across these networks and further enhancing the utility of native assets within the Polygon ecosystem. 
+ + +Union Founder, Karel Kubat, said: + + +> _“The integration between Union and the AggLayer, developed by Polygon Labs, extends beyond a physical integration; it is a testament to both projects’ commitment to addressing the siloed nature of blockchains while upholding sovereignty. A secure link between AggLayer-connected blockchains and IBC-enabled chains will greatly benefit projects in both ecosystems and developers looking for the freedom afforded by the modular design and zk-proof bridging technology.”_ + +_“Union’s integration into the AggLayer is a significant milestone for both the Polygon and Cosmos ecosystems leveraging the AggLayer’s unique architecture to unlock modular and monolithic benefits for both ecosystems,”_ said Marc Boiron, CEO of Polygon Labs. _“It streamlines liquidity flow for current projects and users while catalyzing innovation.”_ +
feat
add polygon blogpost
77557fdc8a62a880b27f01498f8c3de00415547d
2023-12-14 01:47:12
Connor Davis
fix(ci): rename `uniond` bin in release workflow (#1040) * fix(ci): rename uniond output bin * chore(ci): remove debug
false
diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml index 0f56464436..c4a14bffd3 100644 --- a/.github/workflows/release.yml +++ b/.github/workflows/release.yml @@ -71,7 +71,7 @@ jobs: if [[ ${{ matrix.package }} =~ uniond-release ]] then - mv ${{ matrix.system }}.${{ matrix.package }}/bin/uniond ${{ matrix.package }}-${{ matrix.system }} + mv ${{ matrix.system }}.${{ matrix.package }}/bin/uniond uniond-${{ matrix.system }} else mv ${{ matrix.system }}.${{ matrix.package }}/bin/${{ matrix.package }} ${{ matrix.package }}-${{ matrix.system }} fi
fix
rename `uniond` bin in release workflow (#1040) * fix(ci): rename uniond output bin * chore(ci): remove debug
3dd248d9fd37381c4aeff1378b8abfa976cb2482
2025-01-06 18:19:09
Lukas
fix(app): save progress
false
diff --git a/app/src/lib/components/TransferFrom/index.svelte b/app/src/lib/components/TransferFrom/index.svelte index 3778a1e9b0..508c0f36c8 100644 --- a/app/src/lib/components/TransferFrom/index.svelte +++ b/app/src/lib/components/TransferFrom/index.svelte @@ -1,9 +1,9 @@ <script lang="ts"> -import { TRANSFER_DEBUG } from "$lib/components/TransferFrom/config.ts" -import { createTransferStore } from "$lib/components/TransferFrom/transfer" -import DebugBox from "$lib/components/TransferFrom/components/DebugBox.svelte" + import { TRANSFER_DEBUG } from "$lib/components/TransferFrom/config.ts" + import { createTransferStore } from "$lib/components/TransferFrom/transfer" + import DebugBox from "$lib/components/TransferFrom/components/DebugBox.svelte" -const { intents, context, validation } = createTransferStore() + const { intents, context, validation } = createTransferStore() </script> <form @@ -21,12 +21,12 @@ const { intents, context, validation } = createTransferStore() id="source" name="source" placeholder="Enter source chain" - class="w-[300px] p-1 {$validation.source ? 'border-red-500' : ''}" + class="w-[300px] p-1 {$validation.errors.source ? 'border-red-500' : ''}" value={$intents.source} on:input={event => intents.updateField('source', event)} /> - {#if $validation.source} - <span class="text-red-500 text-sm">{$validation.source}</span> + {#if $validation.errors.source} + <span class="text-red-500 text-sm">{$validation.errors.source}</span> {/if} </div> @@ -37,12 +37,12 @@ const { intents, context, validation } = createTransferStore() id="destination" name="destination" placeholder="Enter destination chain" - class="w-[300px] p-1 {$validation.destination ? 'border-red-500' : ''}" + class="w-[300px] p-1 {$validation.errors.destination ? 'border-red-500' : ''}" value={$intents.destination} on:input={event => intents.updateField('destination', event)} /> - {#if $validation.destination} - <span class="text-red-500 text-sm">{$validation.destination}</span> + {#if $validation.errors.destination} + <span class="text-red-500 text-sm">{$validation.errors.destination}</span> {/if} </div> @@ -53,12 +53,12 @@ const { intents, context, validation } = createTransferStore() id="asset" name="asset" placeholder="Enter asset" - class="w-[300px] p-1 {$validation.asset ? 'border-red-500' : ''}" + class="w-[300px] p-1 {$validation.errors.asset ? 'border-red-500' : ''}" value={$intents.asset} on:input={event => intents.updateField('asset', event)} /> - {#if $validation.asset} - <span class="text-red-500 text-sm">{$validation.asset}</span> + {#if $validation.errors.asset} + <span class="text-red-500 text-sm">{$validation.errors.asset}</span> {/if} </div> @@ -80,12 +80,12 @@ const { intents, context, validation } = createTransferStore() data-field="amount" autocapitalize="none" pattern="^[0-9]*[.,]?[0-9]*$" - class="w-[300px] p-1 {$validation.amount ? 'border-red-500' : ''}" + class="w-[300px] p-1 {$validation.errors.amount ? 'border-red-500' : ''}" value={$intents.amount} on:input={event => intents.updateField('amount', event)} /> - {#if $validation.amount} - <span class="text-red-500 text-sm">{$validation.amount}</span> + {#if $validation.errors.amount} + <span class="text-red-500 text-sm">{$validation.errors.amount}</span> {/if} </div> @@ -101,13 +101,13 @@ const { intents, context, validation } = createTransferStore() spellcheck="false" autocomplete="off" data-field="receiver" - class="w-[300px] p-1 disabled:bg-black/30 {$validation.receiver ? 
'border-red-500' : ''}" + class="w-[300px] p-1 disabled:bg-black/30 {$validation.errors.receiver ? 'border-red-500' : ''}" placeholder="Enter destination address" value={$intents.receiver} on:input={event => intents.updateField('receiver', event)} /> - {#if $validation.receiver} - <span class="text-red-500 text-sm">{$validation.receiver}</span> + {#if $validation.errors.receiver} + <span class="text-red-500 text-sm">{$validation.errors.receiver}</span> {/if} </div> </div> diff --git a/app/src/lib/components/TransferFrom/transfer/context.ts b/app/src/lib/components/TransferFrom/transfer/context.ts index fa113d98bc..102590150c 100644 --- a/app/src/lib/components/TransferFrom/transfer/context.ts +++ b/app/src/lib/components/TransferFrom/transfer/context.ts @@ -13,6 +13,7 @@ export type AddressBalance = { gasToken: boolean address: Address symbol: string + chain_id: string } export type NamedBalance = { @@ -21,22 +22,25 @@ export type NamedBalance = { name: string | null symbol: string gasToken: boolean + chain_id: string } -export type EmptyBalance = {} +export type EmptyBalance = { + chain_id: string +} export type Balance = AddressBalance | NamedBalance | EmptyBalance export interface ContextStore { chains: Array<Chain> - userAddress: Readable<UserAddresses> - sourceChain: Readable<Chain | undefined> - destinationChain: Readable<Chain | undefined> - balances: Readable<Array<Balance>> - assetInfo: Readable<Balance | undefined> + userAddress: UserAddresses + sourceChain: Chain | undefined + destinationChain: Chain | undefined + balances: Array<Balance> + assetInfo: Balance | undefined } -export function createContextStore(intents: IntentStore): ContextStore { +export function createContextStore(intents: IntentStore): Readable<ContextStore> { const queryClient = useQueryClient() function queryData<T extends Array<unknown>>( @@ -65,29 +69,24 @@ export function createContextStore(intents: IntentStore): ContextStore { chains.find(chain => chain.chain_id === intentsValue.destination) ) - // Balance data const balances = derived( [ intents, userAddress, userBalancesQuery({ chains, connected: true, userAddr: get(userAddress) }) ], - ([intentsValue, _userAddressValue, rawBalances]) => { - const sourceChain = chains.find(chain => chain.chain_id === intentsValue.source) - if (!sourceChain) return [] - - const chainIndex = chains.findIndex(c => c.chain_id === sourceChain.chain_id) - const balanceResult = rawBalances[chainIndex] - - if (!balanceResult?.isSuccess || balanceResult.data instanceof Error) { - console.log("No balances fetched yet for selected chain") - return [] - } - - return balanceResult.data.map(balance => ({ - ...balance, - balance: BigInt(balance.balance) - })) + ([_intentsValue, _userAddressValue, rawBalances]) => { + return rawBalances.flatMap((balanceResult, index) => { + const chain = chains[index] + if (!balanceResult?.isSuccess || balanceResult.data instanceof Error) { + return [] + } + return balanceResult.data.map(balance => ({ + ...balance, + balance: BigInt(balance.balance), + chain_id: chain.chain_id // Add chain_id to each balance + })) + }) } ) @@ -95,12 +94,15 @@ export function createContextStore(intents: IntentStore): ContextStore { balancesValue.find(x => x?.address === intentsValue.asset) ) - return { - chains, - userAddress, - sourceChain, - destinationChain, - balances, - assetInfo - } + return derived( + [userAddress, sourceChain, destinationChain, balances, assetInfo], + ([$userAddress, $sourceChain, $destinationChain, $balances, $assetInfo]) => ({ + chains, + 
userAddress: $userAddress, + sourceChain: $sourceChain, + destinationChain: $destinationChain, + balances: $balances, + assetInfo: $assetInfo + }) + ) } diff --git a/app/src/lib/components/TransferFrom/transfer/index.ts b/app/src/lib/components/TransferFrom/transfer/index.ts index ce57a11eee..c0b2731e97 100644 --- a/app/src/lib/components/TransferFrom/transfer/index.ts +++ b/app/src/lib/components/TransferFrom/transfer/index.ts @@ -1,23 +1,14 @@ +import { type Readable } from "svelte/store" import { createIntentStore, type IntentStore } from "./intents.ts" -import type { Readable } from "svelte/store" -import type { Chain, UserAddresses } from "$lib/types.ts" -import { type Balance, createContextStore } from "$lib/components/TransferFrom/transfer/context.ts" +import { type ContextStore, createContextStore} from "$lib/components/TransferFrom/transfer/context.ts" import { - createValidationStore, - type ValidationStore + createValidationStore, type ValidationStoreAndMethods } from "$lib/components/TransferFrom/transfer/validation.ts" export interface TransferStore { intents: IntentStore - context: { - chains: Array<Chain> - userAddress: Readable<UserAddresses> - sourceChain: Readable<Chain | undefined> - destinationChain: Readable<Chain | undefined> - balances: Readable<Array<Balance>> - assetInfo: Readable<Balance | undefined> - } - validation: ValidationStore + context: Readable<ContextStore> + validation: ValidationStoreAndMethods } export function createTransferStore(): TransferStore { @@ -30,4 +21,4 @@ export function createTransferStore(): TransferStore { context, validation } -} +} \ No newline at end of file diff --git a/app/src/lib/components/TransferFrom/transfer/validation.ts b/app/src/lib/components/TransferFrom/transfer/validation.ts index 55c805a5d1..e9451dfe06 100644 --- a/app/src/lib/components/TransferFrom/transfer/validation.ts +++ b/app/src/lib/components/TransferFrom/transfer/validation.ts @@ -1,14 +1,19 @@ -import type {Readable} from "svelte/store" -import {derived} from "svelte/store" -import type {IntentStore, FormFields, RawTransferIntents} from "./intents.ts" -import type {Chain} from "$lib/types" -import type {Balance, ContextStore} from "$lib/components/TransferFrom/transfer/context" -import {transferSchema} from "./schema.ts" -import {safeParse} from "valibot" +import type { Readable } from "svelte/store" +import { derived } from "svelte/store" +import type { IntentStore, FormFields, RawTransferIntents } from "./intents.ts" +import type { Chain } from "$lib/types" +import type { Balance, ContextStore } from "$lib/components/TransferFrom/transfer/context" +import { transferSchema } from "./schema.ts" +import { safeParse } from "valibot" export type FieldErrors = Partial<Record<keyof FormFields, string>> -export interface ValidationStore extends Readable<FieldErrors> { +export interface ValidationStore { + errors: FieldErrors + isValid: boolean +} + +export interface ValidationStoreAndMethods extends Readable<ValidationStore> { validate: () => Promise<boolean> } @@ -22,21 +27,12 @@ interface ValidationContext { export function createValidationStore( intents: IntentStore, - context: ContextStore -): ValidationStore { - const errors = derived< - [ - Readable<RawTransferIntents>, - Readable<Array<Balance>>, - Readable<Chain | undefined>, - Readable<Chain | undefined>, - Readable<Balance | undefined> - ], - FieldErrors - >( - [intents, context.balances, context.sourceChain, context.destinationChain, context.assetInfo], - ([$intents, $balances, $sourceChain, 
$destinationChain, $assetInfo]) => { - return validateAll({ + context: Readable<ContextStore> +): ValidationStoreAndMethods { + const store = derived( + [intents, context], + ([$intents, $context]) => { + const errors = validateAll({ formFields: { source: $intents.source, destination: $intents.destination, @@ -44,12 +40,17 @@ export function createValidationStore( receiver: $intents.receiver, amount: $intents.amount }, - balances: $balances, - sourceChain: $sourceChain, - destinationChain: $destinationChain, - assetInfo: $assetInfo, - chains: context.chains + balances: $context.balances, + sourceChain: $context.sourceChain, + destinationChain: $context.destinationChain, + assetInfo: $context.assetInfo, + chains: $context.chains }) + + return { + errors, + isValid: Object.keys(errors).length === 0 + } } ) @@ -104,37 +105,12 @@ export function createValidationStore( }) } - function validateSchema(params: FormFields): FieldErrors { - if (Object.values(params).every(value => !value)) { - return {} - } - - const result = safeParse(transferSchema, params) - - if (!result.success) { - return result.issues.reduce((acc, issue) => { - const fieldName = issue.path?.[0]?.key as keyof FormFields - - if (fieldName && !params[fieldName]) { - return acc - } - - if (fieldName) { - acc[fieldName] = issue.message - } - return acc - }, {} as FieldErrors) - } - - return {} - } - function validateBusinessRules(formFields: FormFields, context: ValidationContext): FieldErrors { if (Object.values(formFields).every(value => !value)) { return {} } const errors: FieldErrors = {} - + if (formFields.source && formFields.destination && formFields.source === formFields.destination) { errors.destination = "Source and destination chains must be different" } @@ -143,18 +119,17 @@ export function createValidationStore( } return { - subscribe: errors.subscribe, + subscribe: store.subscribe, validate: () => { return new Promise(resolve => { - let currentErrors: FieldErrors = {} - const unsubscribe = errors.subscribe(value => { - currentErrors = value + let currentState: ValidationStore + const unsubscribe = store.subscribe(value => { + currentState = value }) - - const isValid = Object.keys(currentErrors).length === 0 + const isValid = currentState!.isValid unsubscribe() resolve(isValid) }) } } -} +} \ No newline at end of file
fix
save progress
f3d17a7f03de0f1ba5b022242a4c27fb6b047bd4
2024-02-06 19:52:57
Hussein Ait Lahcen
feat(uniond): upgrade wasmvm to 1.5.2
false
diff --git a/flake.lock b/flake.lock index 2eff97ac8b..8fae06489c 100644 --- a/flake.lock +++ b/flake.lock @@ -675,16 +675,16 @@ "wasmvm": { "flake": false, "locked": { - "lastModified": 1698746477, - "narHash": "sha256-l0cNF0YjviEl/JLJ4VdvDtIGuAYyFfncVo83ROfQFD8=", + "lastModified": 1705576719, + "narHash": "sha256-3KJq5jFllFSqlm85/iRWYMhu99iuokvR3Ib9Gq3gIjc=", "owner": "CosmWasm", "repo": "wasmvm", - "rev": "2041b184c146f278157d195361bc6cc6b56cc9d4", + "rev": "b742b2623cce98f04ae5d8bfb488c73988f3dd61", "type": "github" }, "original": { "owner": "CosmWasm", - "ref": "v1.5.0", + "ref": "v1.5.2", "repo": "wasmvm", "type": "github" } diff --git a/flake.nix b/flake.nix index 70bb1a7b99..8958d95243 100644 --- a/flake.nix +++ b/flake.nix @@ -67,7 +67,7 @@ nix-filter.url = "github:numtide/nix-filter"; get-flake.url = "github:ursi/get-flake"; wasmvm = { - url = "github:CosmWasm/wasmvm/v1.5.0"; + url = "github:CosmWasm/wasmvm/v1.5.2"; flake = false; }; oxlint = { diff --git a/uniond/go.mod b/uniond/go.mod index d7081d5159..5e0c51316e 100644 --- a/uniond/go.mod +++ b/uniond/go.mod @@ -16,7 +16,7 @@ require ( cosmossdk.io/x/tx v0.12.0 cosmossdk.io/x/upgrade v0.1.0 github.com/CosmWasm/wasmd v0.50.0 - github.com/CosmWasm/wasmvm v1.5.0 + github.com/CosmWasm/wasmvm v1.5.2 github.com/cometbft/cometbft v0.38.2 github.com/cosmos/cosmos-db v1.0.0 github.com/cosmos/cosmos-sdk v0.50.2 diff --git a/uniond/go.sum b/uniond/go.sum index 8aebdb71bf..195905647e 100644 --- a/uniond/go.sum +++ b/uniond/go.sum @@ -232,8 +232,8 @@ github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03 github.com/BurntSushi/xgb v0.0.0-20160522181843-27f122750802/go.mod h1:IVnqGOEym/WlBOVXweHU+Q+/VP0lqqI8lqeDx9IjBqo= github.com/CosmWasm/wasmd v0.50.0 h1:NVaGqCSTRfb9UTDHJwT6nQIWcb6VjlQl88iI+u1+qjE= github.com/CosmWasm/wasmd v0.50.0/go.mod h1:UjmShW4l9YxaMytwJZ7IB7MWzHiynSZP3DdWrG0FRtk= -github.com/CosmWasm/wasmvm v1.5.0 h1:3hKeT9SfwfLhxTGKH3vXaKFzBz1yuvP8SlfwfQXbQfw= -github.com/CosmWasm/wasmvm v1.5.0/go.mod h1:fXB+m2gyh4v9839zlIXdMZGeLAxqUdYdFQqYsTha2hc= +github.com/CosmWasm/wasmvm v1.5.2 h1:+pKB1Mz9GZVt1vadxB+EDdD1FOz3dMNjIKq/58/lrag= +github.com/CosmWasm/wasmvm v1.5.2/go.mod h1:Q0bSEtlktzh7W2hhEaifrFp1Erx11ckQZmjq8FLCyys= github.com/DataDog/datadog-go v3.2.0+incompatible h1:qSG2N4FghB1He/r2mFrWKCaL7dXCilEuNEeAn20fdD4= github.com/DataDog/datadog-go v3.2.0+incompatible/go.mod h1:LButxg5PwREeZtORoXG3tL4fMGNddJ+vMq1mwgfaqoQ= github.com/DataDog/zstd v1.5.5 h1:oWf5W7GtOLgp6bciQYDmhHHjdhYkALu6S/5Ni9ZgSvQ= diff --git a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.aarch64.so b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.aarch64.so index 6733e6c2e7..6e94c112f6 100644 Binary files a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.aarch64.so and b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.aarch64.so differ diff --git a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.dylib b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.dylib index 27dbe8c00c..0e0cecced7 100644 Binary files a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.dylib and b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.dylib differ diff --git a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.x86_64.so b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.x86_64.so index 2fa089dff4..7a38617c2a 100644 Binary files a/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.x86_64.so and 
b/uniond/vendor/github.com/CosmWasm/wasmvm/internal/api/libwasmvm.x86_64.so differ diff --git a/uniond/vendor/modules.txt b/uniond/vendor/modules.txt index 41c77b335f..336f9de2e1 100644 --- a/uniond/vendor/modules.txt +++ b/uniond/vendor/modules.txt @@ -217,7 +217,7 @@ github.com/CosmWasm/wasmd/x/wasm/migrations/v2 github.com/CosmWasm/wasmd/x/wasm/migrations/v3 github.com/CosmWasm/wasmd/x/wasm/simulation github.com/CosmWasm/wasmd/x/wasm/types -# github.com/CosmWasm/wasmvm v1.5.0 +# github.com/CosmWasm/wasmvm v1.5.2 ## explicit; go 1.18 github.com/CosmWasm/wasmvm github.com/CosmWasm/wasmvm/internal/api
feat
upgrade wasmvm to 1.5.2
4389cb2cc3230c6635e00bd6917b8eb59fa23d2b
2024-08-11 16:12:48
omar
chore: update ts sdk examples
false
"diff --git a/typescript-sdk/.env.example b/typescript-sdk/.env.example\nindex 7c4b7d254d..8b3d7d4fe(...TRUNCATED)
chore
update ts sdk examples
5d22acb9070f543cd1df87a70f27db4ce3d99657
2025-02-25 22:32:03
cor
chore(app2): cleanup
false
"diff --git a/app2/src/generated/graphql-env.d.ts b/app2/src/generated/graphql-env.d.ts\nindex 58877(...TRUNCATED)
chore
cleanup
f7a3d353adb5ed3d798f08c70da9b34bf3d29596
2023-06-16 17:20:50
aeryz
chore(spellcheck): exclude tests Signed-off-by: aeryz <abdullaheryz@protonmail.com>
false
"diff --git a/cspell.json b/cspell.json\nindex 6090a20e1f..1dc680fa41 100644\n--- a/cspell.json\n+++(...TRUNCATED)
chore
exclude tests Signed-off-by: aeryz <abdullaheryz@protonmail.com>
6f371322c89d66cb93c428c39ef2bc007610e01a
2024-02-02 01:33:05
Hussein Ait Lahcen
fix(cometbls-groth16-verifier): verify pedersen commitment
false
"diff --git a/lib/cometbls-groth16-verifier/src/lib.rs b/lib/cometbls-groth16-verifier/src/lib.rs\ni(...TRUNCATED)
fix
verify pedersen commitment
75df1e64b1ec1493c8f7187e84b76477fbdf66d2
2024-11-08 19:26:45
o-az
chore: prevent npm install usage
false
"diff --git a/typescript-sdk/.npmrc b/typescript-sdk/.npmrc\nindex 6018a1cb5f..f70cfbc4d3 100644\n--(...TRUNCATED)
chore
prevent npm install usage
End of preview.
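Written out explicitly, the reference schema above corresponds to the following `datasets` Features. This is a sketch: the file path in the loading call is illustrative, and passing `features` only pins the expected column set; it does not by itself reconcile files whose headers differ, so one of the fixes above is still needed.

```python
from datasets import Features, Value, load_dataset

# The snake_case schema shown in the preview header.
features = Features({
    "hash": Value("string"),
    "date": Value("string"),
    "author": Value("string"),
    "commit_message": Value("string"),
    "is_merge": Value("bool"),
    "git_diff": Value("string"),
    "type": Value("string"),
    "masked_commit_message": Value("string"),
})

ds = load_dataset(
    "csv",
    data_files={"train": "commits.csv"},  # illustrative path: a file whose header already matches
    features=features,
    split="train",
)
print(ds.features)
print(ds[0]["type"], "-", ds[0]["masked_commit_message"])
```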

No dataset card yet

Downloads last month: 27