
nautilus's Introduction


nautilus logo

The Data Economy TypeScript Toolkit


A TypeScript library enabling you to explore the Data Economy. It is built on top of ocean.js and offers feature-complete, automated interactions with any Ocean Protocol ecosystem.

Overview

nautilus addresses many common pain points faced by developers interacting with the data economy by offering a range of features that enhance productivity and efficiency. This page gives you a quick introduction to get you set up with the Data Economy TypeScript Toolkit.

Looking for dedicated feature documentation? Follow the links below:

Quick Start

1. Setup your Signer

First, create the signer you want to use with your nautilus instance. nautilus uses the ethers.js Signer. You can read more about possible configurations in the official documentation.

import { Wallet, providers } from 'ethers'
import { Nautilus } from '@deltadao/nautilus'

const provider = new providers.JsonRpcProvider('https://rpc.dev.pontus-x.eu') 
const signer = new Wallet('0x...', provider) 

In this example we create an ethers Wallet from a given private key and connect it to an RPC provider of our choice.

2. Setup the nautilus instance

Now that you have a Signer set up, you can use it to bootstrap your nautilus client instance.

import { Wallet, providers } from 'ethers'
import { Nautilus } from '@deltadao/nautilus'

const provider = new providers.JsonRpcProvider('https://rpc.dev.pontus-x.eu')
const signer = new Wallet('0x...', provider)

const nautilus = await Nautilus.create(signer) 

Note that we use the previously created Wallet and pass it to Nautilus to create the instance with this signer.

3. Interact with the data economy

With the client instance bootstrapped, you can now trigger any transactions or access calls supported by Ocean Protocol.

import { Wallet, providers } from 'ethers'
import { Nautilus } from '@deltadao/nautilus'

const provider = new providers.JsonRpcProvider('https://rpc.dev.pontus-x.eu')
const signer = new Wallet('0x...', provider)

const nautilus = await Nautilus.create(signer)

const accessUrl = await nautilus.access({ assetDid: 'did:op:12345'}) 
const data = await fetch(accessUrl) 

In this example we construct a one-time accessUrl and can then use it to fetch the data associated with the respective data service.
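The fetch step can fail silently if the service responds with an error page instead of the data. As a small sketch of a more defensive variant, the helper below (the name `downloadAsset` is our own, not part of nautilus) checks the HTTP status before parsing and assumes the data service returns JSON:

```typescript
// Hypothetical helper around the flow above -- `downloadAsset` is not part of
// nautilus; it simply guards the fetch with an HTTP status check. Assumes the
// data service returns JSON; switch to .text() or .arrayBuffer() otherwise.
async function downloadAsset(accessUrl: string): Promise<unknown> {
  const response = await fetch(accessUrl)
  if (!response.ok) {
    throw new Error(`Download failed: HTTP ${response.status}`)
  }
  return response.json()
}

// Hypothetical usage with the one-time URL from the example above:
// const accessUrl = await nautilus.access({ assetDid: 'did:op:12345' })
// const data = await downloadAsset(accessUrl)
```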

Next Steps

Find dedicated feature documentation by following one of the links below:

If you want to jump straight into code, feel free to take a look at some of our code examples in the nautilus-examples repository.

License

Copyright (C) 2023 deltaDAO AG

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

nautilus's People

Contributors

abrom8, dependabot[bot], github-actions[bot], moritzkirstein


Forkers

w1kke, bjbe82

nautilus's Issues

[Feature] Automated CtD

Motivation / Problem

Add automated compute-to-data jobs

Solution

interface ComputeConfigOptions {
  datasetServiceParams?: any
  algorithmServiceParams?: any
  algoCustomData?: any
}

interface ComputeConfig {
  datasetDid: string
  algorithmDid: string
  web3: Web3
  options?: ComputeConfigOptions
}

async function compute(config: ComputeConfig): Promise<string>

async function retrieveResult(jobId: string)

  • for ordering of CtD jobs, see Market order Flow
  • retrieveResult stores result & logs in RESULT_FOLDER


[Feature] Provide "edit" asset function

Motivation / Problem

Current state "Publishing":

classDiagram
    note for DDO "ocean.js spec DDO"
    AssetBuilder *-- NautilusAsset
    NautilusAsset *-- NautilusDDO
    NautilusDDO *-- DDO
    class NautilusAsset{
        +NautilusDDO nautilusDDO
    }
    class AssetBuilder{
        +NautilusAsset nautilusAsset
    }
    class NautilusDDO{
        +DDO ddo
    }
    class DDO{
        +object metadata
        +service[] services
    }

We want to be able to reuse the publishing flow for the editing functionality.

Solution

  • Create constructors to allow pre-loading the DDO object of NautilusDDO with aquarius metadata
    • NautilusDDO: static createFromDDO(ddo: DDO)
    • NautilusAsset
    • AssetBuilder
  • Helper function to create NautilusDDO / NautilusAsset from a given did
  • New Nautilus.edit() function that re-uses the existing publishing flow
  • Tests verifying that edits / updates are applied correctly
    • Metadata changes
    • Service changes
    • ...
  • confirm test-edits with asset queries (hard because of the aquarius delay; maybe we can trigger caching, see: https://docs.oceanprotocol.com/developers/aquarius/asset-requests#trigger-caching)

[Feature] Add typedoc documentation

Motivation / Problem

We need to provide proper documentation for the library

Solution

  • Add documentation to exported functions
  • Add documentation setup & config

[Enhancement] Refactor compute/index

Motivation

We started with compute functionality and have a lot of code currently located in the compute file that should be extracted.

Solution

Refactor the compute/index.ts file

  • extract all functions that are reused in other files
  • shorten the compute function to be more readable
  • extract functions that could be used in other sections (e.g. subgraph / aqua / provider calls)

[Feature] Enhance Nautilus Class

Motivation / Problem

Make all supported functionality available on the Nautilus class

Solution

  • publish
  • #7
  • access
  • always use nautilus configs when triggering these functions
  • #10

[BUG] Publishing dataset fails due to 'exceeds block gas limit' error

Summary

When trying to publish a dataset to the Gen-X network with the given configs, a web3 error is thrown.

Current Behavior

The publishing doesn't work and the error 'exceeds block gas limit' is thrown

Expected Behavior

The dataset is published to the DLT and can be seen in the portal.

Steps to Reproduce

Network_Config:

{
  network: 'genx',
  chainId: 100,
  metadataCacheUri: 'https://aquarius510.v4.delta-dao.com',
  nodeUri: 'https://rpc.genx.minimal-gaia-x.eu',
  providerUri: 'https://provider.v4.genx.delta-dao.com',
  subgraphUri: 'https://subgraph.v4.genx.minimal-gaia-x.eu',
  oceanTokenAddress: '0x0995527d3473b3a98c471f1ed8787acd77fbf009',
  oceanTokenDecimals: 18,
  oceanTokenSymbol: 'OCEAN',
  fixedRateExchangeAddress: '0xAD8E7d2aFf5F5ae7c2645a52110851914eE6664b',
  dispenserAddress: '0x94cb8FC8719Ed09bE3D9c696d2037EA95ef68d3e',
  nftFactoryAddress: '0x6cb85858183b82154921f68b434299ec4281da53',
  providerAddress: '0x68C24FA5b2319C81b34f248d1f928601D2E5246B',
  euroeAddress: '0xe974c4894996e012399dedbda0be7314a73bbff1',
  euroeDecimals: 6
}

Pricing_Config:

{
  type: 'fixed',
  freCreationParams: {
    fixedRateAddress: config.fixedRateExchangeAddress,
    baseTokenAddress: config.euroeAddress,
    baseTokenDecimals: config.euroeDecimals,
    datatokenDecimals: 18,
    fixedRate: '1',
    marketFee: '0',
    marketFeeCollector: '0x0000000000000000000000000000000000000000'
  }
}

Error message:

\service-pub\node_modules\web3\node_modules\web3-core\lib\commonjs\web3_request_manager.js:258
throw new web3_errors_1.InvalidResponseError(response, payload);
^

InvalidResponseError: Returned error: exceeds block gas limit
at Web3RequestManager._processJsonRpcResponse (\service-pub\node_modules\web3\node_modules\web3-core\lib\commonjs\web3_request_manager.js:258:23)
at Web3RequestManager. (\service-pub\node_modules\web3\node_modules\web3-core\lib\commonjs\web3_request_manager.js:156:29)
at Generator.next ()
at fulfilled (\service-pub\node_modules\web3\node_modules\web3-core\lib\commonjs\web3_request_manager.js:21:58)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
innerError: { code: -32600, message: 'exceeds block gas limit' },
code: 101,
data: undefined,
request: {
jsonrpc: '2.0',
id: '2a351139-c976-447a-8c48-c58e6a1711d0',
method: 'eth_sendRawTransaction',
}
}

Environment

  • OS: Windows 11
  • Node: 18.15.0
  • npm: 9.6.4

Anything else

  • It's not account dependent. I tried two different accounts for publishing and they both create this error.

[Enhancement] Create API Docs for all relevant classes

Motivation / Problem

Current docs only have API documentation for Nautilus and AssetBuilder.
We also document NautilusAsset which is only used internally.

Solution

Add missing docs (e.g. ServiceBuilder, ConsumerParameterBuilder) and remove internal classes from documentation

[BUG] getAquariusAsset not working well

Summary

The getAquariusAsset function fails when trying to fetch from the Aquarius cache; it seems to be using a different Aquarius URI than the one configured.

Current Behavior

In the logs I get:

Retrieve asset did:op:86bb40c1d6702cf7131837c051282b984e68d59eed6526b01f2380f7f25c645f  using cache 
at https://aquarius.v4.delta-dao.com
Request failed with status code 404
Error: getAquariusAsset failed: AxiosError: Request failed with status code 404

Note that the cache shown in the logs is different from the one I provided in my configuration file. I think this cache is used statically somewhere in the code. The desired Aquarius cache URI should be: https://aquarius510.v4.delta-dao.com.

Even when I make a simple GET request for any dataset using the previous cache URI, https://aquarius.v4.delta-dao.com, I get a 404 Not Found.

Environment

  • OS: Windows 10 Enterprise
  • Node: 16.20.0
  • npm: 8.19.4

[Feature] Create NautilusDDO wrapper class

Motivation / Problem

We want to enable editing from within Nautilus.
#10

Solution

We need to further split up logic in publish to easily enable edit functionality.
For this we should create a NautilusDDO class providing a wrapper for the ocean.js DDO.

This should add additional functionality to the DDO:

  • move metadata from NautilusAsset to NautilusDDO
  • move services from NautilusAsset to NautilusDDO
  • construct ocean.js ddo using NautilusDDO on publish:
    • helper function, e.g. getDDO(), which returns NautilusDDO.ddo
    • helper function to construct the ddo using metadata & services, e.g. buildDDO; ddo can contain an already published DDO, in which case the configs in metadata and services overwrite the previous data
    • getDDO returns
      • buildDDO() if metadata | services != undefined
      • ddo else
classDiagram
    AssetBuilder *-- NautilusAsset
    NautilusAsset *-- NautilusDDO
    class NautilusDDO{
        -ServiceConfig services
        -MetadataConfig metadata
        -DDO ddo
        -buildDDO()
        +getDDO() DDO
    }
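The getDDO / buildDDO behaviour described above can be sketched as follows. This is our own illustration with heavily simplified types; the real ocean.js DDO and the eventual NautilusDDO internals will differ:

```typescript
// Simplified sketch of the proposed NautilusDDO behaviour; DDO is reduced to
// the two fields relevant here, the real ocean.js type is much richer.
type DDO = { metadata?: Record<string, unknown>; services?: unknown[] }

class NautilusDDO {
  private ddo: DDO = {}
  private metadata?: Record<string, unknown>
  private services?: unknown[]

  // pre-load with an already published DDO (e.g. fetched from aquarius)
  static createFromDDO(ddo: DDO): NautilusDDO {
    const instance = new NautilusDDO()
    instance.ddo = ddo
    return instance
  }

  setMetadata(metadata: Record<string, unknown>): void {
    this.metadata = metadata
  }

  // new configs overwrite the previously published data
  private buildDDO(): DDO {
    return {
      ...this.ddo,
      ...(this.metadata !== undefined && { metadata: this.metadata }),
      ...(this.services !== undefined && { services: this.services })
    }
  }

  // buildDDO() if metadata | services != undefined, the plain ddo otherwise
  getDDO(): DDO {
    return this.metadata !== undefined || this.services !== undefined
      ? this.buildDDO()
      : this.ddo
  }
}
```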

[BUG] Error [ERR_PACKAGE_PATH_NOT_EXPORTED] with 0.2.0

Summary

Upgraded Nautilus to 0.2.0 in a running project. The project no longer executes without further changes to the original code.

Current Behavior

Getting error:

Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './dist/src/@types' is not defined by "exports" in .../nautilus/node_modules/@deltadao/nautilus/package.json

By the way, dist/src/index.d.ts content is:

export * from './access';
export * from './compute';
export * from './publish';
export * from './Nautilus';

Environment

"@deltadao/nautilus": "^0.2.0",
"dotenv": "^16.3.1",
"ts-node": "^10.9.1",
"typescript": "^5.0.2",
"web3": "^1.9.0"

[Enhancement] Refactor publish

Motivation / Problem

Publish needs to be usable in multiple scenarios: initial asset creation, as well as updating.

Solution

  • Refactor publish to use ocean.js types (e.g., correctly prepared DDO, instead of metadata + services)
  • Extract specific functionality for DDO preparation step(s) to NautilusDDO wrapper class
  • Update publish function to use new NautilusDDO

[BUG] Algo consumerParameters are reset on edit

Summary

Editing an algorithm offering resets the consumerParameters in metadata.algorithm

Current Behavior

When editing an algorithm offering the consumerParameters in metadata.algorithm are reset and empty.

Expected Behavior

When editing an algorithm offering the consumerParameters in metadata.algorithm persist from loaded aquariusAsset

[Enhancement] Improve error message on access calls

Motivation / Problem

When trying to access an asset of type != 'access', the function call fails with a long stack trace.

Solution

We should improve the error handling and check whether we can verify the correct asset type even before doing the provider calls.

Additional context

Error trace

Creating new Nautilus instance with signer 0x...
[aquarius] Retrieve asset did:op:f2163fe539c72bf255fe96c8af32ae57b16ed4db76f71555d2401db0ec2dfb54 using cache at https://aquarius510.v4.delta-dao.com/
Error getting access details:  Cannot read properties of undefined (reading 'token')
[access] AccessDetails: undefined
Error initializing provider for access!
TypeError: Cannot read properties of undefined (reading 'id')
    at C:\node_typescript\node_modules\@deltadao\nautilus\src\utils\provider.ts:61:15
    at _catch (C:\node_typescript\node_modules\@deltadao\nautilus\src\utils\order.ts:251:2)
    at initializeProvider (C:\node_typescript\node_modules\@deltadao\nautilus\src\utils\provider.ts:56:44)
    at C:\node_typescript\node_modules\@deltadao\nautilus\src\access\index.ts:57:32
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async access (C:\node_typescript\utils\access.ts:4:21)
TypeError: Cannot read properties of null (reading 'providerFee')
    at C:\node_typescript\node_modules\@deltadao\nautilus\src\access\index.ts:69:20
    at processTicksAndRejections (node:internal/process/task_queues:96:5)

[BUG] GetComputeResult return before finishing the job

Summary

I tried to perform a compute flow with nautilus. When calling getComputeResult in my function, it returns before the job has finished. I implemented a polling system, which works fine but is not nice to have. I hard-coded the status 70, which refers to a finished job.

Current Behavior

Before implementing the polling system I tried to adapt the examples provided here to fit my application. Without the polling system, the application sends back:

[compute] Retrieve results: job does not exist or is not yet finished.

Here the code:

export async function computeDataset(req: Request, res: Response) {
    const { datasetId, algorithmId } = req.body;
    try {
        const nautilus = await Nautilus.create(wallet, networkConfig);
        const dataset = { did: datasetId };
        const algorithm = { did: algorithmId };
        const computeJob = await nautilus.compute({ dataset, algorithm });
        console.log("Compute result", computeJob);
        const computeResult = Array.isArray(computeJob) ? computeJob[0] : computeJob;
        const { jobId } = computeResult;
        console.log(jobId);
        await getComputeStatus(nautilus, networkConfig.providerUri, jobId);
        const result = await retrieveComputeResult(nautilus, networkConfig.providerUri, jobId);

        console.log(result);

        // const data = computeResultUrl && await fetch(computeResultUrl)
        // console.log(data)
        // res.status(200).json({ message: "Compute results", computeResult, computeJobStatus })
    } catch (error) {
        console.error(error);
        res.status(500).json({ message: "Problem occurred" });
    }
}

async function getComputeStatus(
    nautilus: Nautilus,
    providerUri: string,
    jobId: string
) {
    const computeJobStatus = await nautilus.getComputeStatus({
        jobId,
        providerUri
    });

    console.log(computeJobStatus);
}

async function retrieveComputeResult(
    nautilus: Nautilus,
    providerUri: string,
    jobId: string
) {
    return await nautilus.getComputeResult({
        jobId,
        providerUri
    });
}

Here my code with the polling system:

async function getComputeStatus(nautilus: Nautilus, providerUri: string, jobId: string) {
    const computeJobStatus = await nautilus.getComputeStatus({ jobId, providerUri });
    console.log(computeJobStatus);
    return computeJobStatus;
}

async function waitForComputeCompletion(nautilus: Nautilus, providerUri: string, jobId: string) {
    const delay = (ms: number) => new Promise(resolve => setTimeout(resolve, ms));

    while (true) {
        const status = await getComputeStatus(nautilus, providerUri, jobId);

        if (status.status === 70) {
            break;
        }

        console.log(`Current status: ${status.status}. Waiting...`);
        await delay(5000); // Wait for 5 seconds before checking again
    }
}

async function retrieveComputeResult(nautilus: Nautilus, providerUri: string, jobId: string) {
    return await nautilus.getComputeResult({ jobId, providerUri });
}

Is there a better solution to handle the Promises instead of the polling system?

I call these functions in another function which I then exposed to the routes of my API.

export async function computeDataset(req: Request, res: Response) {
    const { datasetId, algorithmId } = req.body;
    try {
        const nautilus = await Nautilus.create(wallet, networkConfig);
        const dataset = { did: datasetId };
        const algorithm = { did: algorithmId };

        const computeJob = await nautilus.compute({ dataset, algorithm });
        console.log("Compute result", computeJob);

        const computeResult = Array.isArray(computeJob) ? computeJob[0] : computeJob;
        const { jobId } = computeResult;
        console.log(jobId);

        await waitForComputeCompletion(nautilus, networkConfig.providerUri, jobId);

        const result = await retrieveComputeResult(nautilus, networkConfig.providerUri, jobId);
        console.log(result);

        res.status(200).json({ message: "Compute results", result });
    } catch (error) {
        console.error(error);
        res.status(500).json({ message: "Problem occurred" });
    }
}
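As a hedged note on the question above: polling is currently the expected pattern for compute jobs, but the loop can be made reusable and safer with a timeout guard. The helper below is our own sketch, not a nautilus API; it accepts any async status check, so the `status === 70` predicate from this issue plugs straight in:

```typescript
// Generic polling helper with a timeout guard -- a sketch, not part of
// nautilus. `check` is any async status call, `isDone` decides completion.
async function pollUntil<T>(
  check: () => Promise<T>,
  isDone: (result: T) => boolean,
  { intervalMs = 5000, timeoutMs = 600_000 } = {}
): Promise<T> {
  const deadline = Date.now() + timeoutMs
  for (;;) {
    const result = await check()
    if (isDone(result)) return result
    if (Date.now() >= deadline) {
      throw new Error(`Polling timed out after ${timeoutMs} ms`)
    }
    // wait before the next status check
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
}

// Hypothetical usage with the status check from this issue (70 = finished):
// const status = await pollUntil(
//   () => nautilus.getComputeStatus({ jobId, providerUri }),
//   (s) => s.status === 70,
//   { intervalMs: 5000, timeoutMs: 30 * 60 * 1000 }
// )
```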

Expected Behavior

I think it should work fine as in the example provided here. The compute job should complete before the results are sent back.

Steps to Reproduce

I am using Express to expose an API.
I stored my network configuration in a variable, but you can declare it inline as well.

Environment

  • OS: Windows 10 Enterprise
  • Node: 18.20.2
  • npm: 10.5.0


[Feature] Provide compute functions via Nautilus class

Motivation / Problem

We need to make compute functions available via Nautilus class for easier use

Solution

Make exports "internal" and add proxy functions to Nautilus class

  • compute
  • computeStatus
  • retrieveResults


[Feature] Provide an Example how a SaaS Service description should be done via Nautilus

Motivation / Problem

At the moment, creating a SaaS service description with nautilus is a bit of trial and error. After the latest update it looks like a file type is mandatory for ServiceBuilder, which is not really useful for a SaaS offering.

Tried:

export async function publishAnalysisStack(
  nautilus: Nautilus,
  networkConfig: NetworkConfig,
  pricingConfig: PricingConfigWithoutOwner,
  wallet: Wallet,
) {
  const owner = await wallet.getAddress();
  console.log(`The owner address is ${owner}`);

  const service = createService(networkConfig, pricingConfig);
  console.log('Created Service: ', service);

  const asset = createAsset(owner, service);
  console.log('Created Asset: ', asset);

  return nautilus.publish(asset);
}

const createService = (
  networkConfig: NetworkConfig,
  pricingConfig: PricingConfigWithoutOwner,
) => {
  const serviceBuilder = new ServiceBuilder({
    serviceType: ServiceTypes.COMPUTE,
    fileType: undefined,
  });

  return serviceBuilder
    .setServiceEndpoint(networkConfig.providerUri)
    .setTimeout(0)
    .setPricing(pricingConfig)
    .setDatatokenNameAndSymbol('COOP Analysis Stack', 'CAS')
    .setName('COOP Analysis Stack')
    .setDescription(
      'COOP Analysis Stack description. This is a placeholder for a more detailed description.',
    )
    .build();
};

const createAsset = (
  owner: string,
  service: NautilusService<ServiceTypes, FileTypes>,
) => {
  const assetBuilder = new AssetBuilder();

  return (
    assetBuilder
      .setType('algorithm')
      .setName('ScopeSET Analysis Stack')
      // supports markdown -> add some more nice text later
      .setDescription(
        `# ScopeSET Analysis Stack \n\n
        This asset has been published using [Nautilus](https://github.com/deltaDAO/nautilus).`,
      )
      .setAuthor('ScopeSET')
      .setLicense('MIT')
      .addService(service)
      .addAdditionalInformation({
        //To force the asset to be displayed as SaaS the next entry is required!
        //see https://github.com/deltaDAO/nautilus/issues/59
        saas: {
          redirectUrl: 'https://saas.helpdesk.de/',
        },
      })
      .setOwner(owner)
      .build()
  );
};

--> Error: Required attributes are missing to create a valid Ocean DDO

Using something like this:

  const serviceBuilder = new ServiceBuilder({
    serviceType: ServiceTypes.ACCESS,
    fileType: FileTypes.URL,
  });

  //we need some file as it looks like ....
  const fakeURL: UrlFile = {
    type: 'url',
    url: 'https://raw.githubusercontent.com/deltaDAO/nautilus-examples/main/example_publish_assets/example-dataset.json', // link to your file or api
    method: 'GET',
  };

  return serviceBuilder
    .setServiceEndpoint(networkConfig.providerUri)
    .setTimeout(0)
    .setPricing(pricingConfig)
    .addFile(fakeURL)
    .setDatatokenNameAndSymbol('COOP Analysis Stack', 'CAS')
    .setName('COOP Analysis Stack')
    .setDescription(
      'COOP Analysis Stack description. This is a placeholder for a more detailed description.',
    )
    .build();

works somehow.

[Enhancement] Restructure Types

Motivation / Problem

We now need more types than initially expected and have to refactor/restructure the src/@types folder

Solution

[BUG] empty computeStatus response caused by different provider

Nautilus asks the configured provider, not the asset's provider, for the job status. This results in an empty/undefined response if the asset's provider is different from the one in the nautilus config:

    algoDID: 'did:op:926098d058b017dcf3736370f3c3d77e6046ca6622af111229accf5f9c83e308',
    inputDID: [
      'did:op:55939e969f36d6879d1f9c1ebd0b51f8a4d7c0673a84c050b21224840e997ec7'
    ]
[compute] Retrieve job status: {
  jobId: 'e2b3c4445d7e4ee18ce38ed4311b82a1',
  providerUri: 'https://provider.dev.pontus-x.eu', // should be https://dd1.provider.dev.pontus-x.eu
  account: '0x96F7eD242f58FAbb0d4EB5AD6C8301582447d136'
}

These should be defined, but come back empty/undefined:

[compute] computeStatus response:  []
Compute Job Status:  undefined
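A possible direction for a fix, sketched with heavily simplified types (the real DDO service objects carry more fields, and the actual nautilus internals may differ): resolve the provider URI from the asset's own compute service and only fall back to the configured one.

```typescript
// Sketch of resolving the status endpoint from the asset's own service
// instead of the globally configured provider. Types are simplified
// assumptions, not the actual nautilus/ocean.js definitions.
type Service = { type: string; serviceEndpoint: string }

function resolveProviderUri(
  services: Service[],
  configuredProviderUri: string
): string {
  // prefer the compute service endpoint published with the asset itself
  const computeService = services.find((s) => s.type === 'compute')
  return computeService?.serviceEndpoint ?? configuredProviderUri
}
```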

[Enhancement] More/refined publish builders

Motivation / Problem

For things like pricing, datatokens / nfts and services we may want to provide builder classes just as we do with the assetBuilder itself

Solution

Decide on a setup for version 1. Does not need to be the end all solution, but should provide an initial UX that we are happy with.

e.g.,

// we could add a service builder
const service = serviceBuilder
  .addFile(file)
  .setType('access')
  .setEndpoint(providerUri)
  .build()

const asset = assetBuilder
  .addService(service)
  .setPricing({
    // simplify price construction for enduser
    type: 'fixed',
    baseToken: '0x...',
    price: '10'
  })
  //...
  .build()

  • ServiceBuilder
  • ConsumerParameterBuilder
  • Enhanced Pricing Solution
  • Enhanced Datatoken/NFT solution

Alternatives

The above proposal is an example. We might identify other useful builders, or decide not to implement some of them for the first release.

[Feature] NautilusDDO Tests

Motivation / Problem

We created a new class that needs testing

Solution

Refactor existing tests of moved functionality, and create new tests for NautilusDDO wrapper

[BUG] Cannot create Service with parameter

Summary

I tried to create a SaaS service description, which produces multiple errors when using the nautilus API. To narrow down the error, I replaced all of my code with the provided example code for publishing a service (see below). The only change was adding a consumer parameter.

export async function publishAnalysisStack(
  nautilus: Nautilus,
  networkConfig: NetworkConfig,
  pricingConfig: PricingConfigWithoutOwner,
  wallet: Wallet,
) {
  const owner = await wallet.getAddress();
  console.log(`The owner address is ${owner}`);
  const serviceBuilder = new ServiceBuilder({
    serviceType: ServiceTypes.ACCESS,
    fileType: FileTypes.URL,
  }); // access type dataset with URL data source

  const urlFile: UrlFile = {
    type: 'url', // there are multiple supported data source types, see https://docs.oceanprotocol.com/developers/storage
    url: 'https://raw.githubusercontent.com/deltaDAO/nautilus-examples/main/example_publish_assets/example-dataset.json', // link to your file or api
    method: 'GET', // HTTP request method
  };

  const consumerParameterBuilder = new ConsumerParameterBuilder();
  const petitionerParam = consumerParameterBuilder
    .setType('text')
    .setName('Public Address')
    .setLabel('Public Address Label')
    .setDescription('Web3 Public Address of the petitioner.')
    .setRequired(true)
    .build();

  const service = serviceBuilder
    .setServiceEndpoint(networkConfig.providerUri)
    .setTimeout(60)
    .addFile(urlFile)
    .setPricing(pricingConfig)
    .addConsumerParameter(petitionerParam)
    .setDatatokenNameAndSymbol('My Datatoken Name', 'SYMBOL') // important for following access token transactions in the explorer
    .build();

  const assetBuilder = new AssetBuilder();
  const asset = assetBuilder
    .setType('dataset')
    .setName('Nautilus-Example: Access Dataset Name')
    .setDescription(
      '# Nautilus-Example Description \n\nThis asset has been published using the [nautilus-examples](https://github.com/deltaDAO/nautilus-examples) repository.',
    )
    .setAuthor('Company Name')
    .setLicense('MIT')
    .addService(service)
    .setOwner(owner)
    .build();

  const result = await nautilus.publish(asset);
  console.log(result);
}

Current Behavior

Trying to execute the code throws an error:

Error: Validating Metadata failed: [object Object]
    at E:\Arbeit\COOP\coop-stack\gaia-x-nautilus\node_modules\@deltadao\nautilus\dist\lib.js:1:26030
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async publishAnalysisStack (E:\Arbeit\COOP\coop-stack\gaia-x-nautilus\dist\publish-analysis-stack.js:43:20)
    at async main (E:\Arbeit\COOP\coop-stack\gaia-x-nautilus\dist\index.js:42:20)

Expected Behavior

A Service description with parameter is available in the portal.

Environment

  • "@deltadao/nautilus": "1.0.0-beta.1",
  • "dotenv": "^16.3.1",
  • "ethers": "^5.7.2"
  • Node: 20.10.0
  • npm: 10.2.3
  • Windows 10

[Feature] "Default" Pricing configs

Motivation / Problem

We need pre-defined configs for network & opf supported tokens

  • fixedRateAddresses, Dispensers etc
  • e.g. OCEAN / EUROe

Solution

  • Update AssetBuilder.setPricing()


[Feature] Automated Publishing

Motivation / Problem

Add automated publishing of assets

Solution

type DatatokenConfig = { name: string; symbol: string }
type Pricing = 'free' | { fixedRate: number, marketFee?: number }
interface AssetConfig {
  metadata: Metadata | string
  service: Service | string
  web3: Web3
  pricing: Pricing
  datatoken?: DatatokenConfig
}

async function publishAsset(config: AssetConfig)
  • Create function publishAsset with the following parameters
    • Metadata: DDO metadata as object or filepath
    • Service: DDO service as object or filepath
    • Web3: used to sign txs for the publish flow
    • Datatoken (optional): configuration for the datatoken of the service to be published
  • metadata.description allowed as raw string or filepath to a .md file
  • Support fixed & free pricing
  • use ocean.js to publish the asset
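The proposed Pricing union could be normalized internally along these lines. This is our own sketch: the name `normalizePricing` and the output shape are illustrative assumptions, not an existing nautilus API. The rates and fees are emitted as decimal strings, matching the fixed-rate configs shown elsewhere in this document:

```typescript
// Sketch: normalize the proposed Pricing union ('free' | fixed-rate object)
// into one internal shape. All names here are illustrative assumptions.
type Pricing = 'free' | { fixedRate: number; marketFee?: number }

type NormalizedPricing =
  | { type: 'free' }
  | { type: 'fixed'; fixedRate: string; marketFee: string }

function normalizePricing(pricing: Pricing): NormalizedPricing {
  if (pricing === 'free') return { type: 'free' }
  return {
    type: 'fixed',
    // decimal strings, as used in the freCreationParams configs above
    fixedRate: String(pricing.fixedRate),
    marketFee: String(pricing.marketFee ?? 0)
  }
}
```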


[BUG][0.3.0-alpha4] access error

Nautilus Version: 0.3.0-alpha4

Error:

TypeError: Cannot read properties of undefined (reading '1')

Code:

async function main() {
    Nautilus.setLogLevel(LogLevel.Verbose)
    const nautilus = await Nautilus.create(web3, genx_config )
    console.log(nautilus.getOceanConfig())

    await access(nautilus)
}

async function access(nautilus) {

    const accessUrl = await nautilus.access({
        assetDid: 'did:op:8ca612cfc6ffac030ca1a0f3d63f2ba72ec6e409d487f19a815b43bc11160b34'
    })
    console.log(accessUrl)
}

[Enhancement] add missing DDO attributes

Motivation / Problem

tags and copyrightHolder are missing in the assetBuilder

TODO: check for other missing attributes
TODO2: implement functions for all missing attributes

[BUG] Pricing not showing well after publishing a dataset using nautilus

Summary

Maybe not a bug.
Hi everyone.
When trying to publish a dataset with a specific price using nautilus, I noticed that the price shown later in the marketplace catalogue is not the one I set. For example, when given 1 OCEAN, the catalogue shows a price of 1.001 OCEAN. When I use the editPrice function, it seems to work fine.

Maybe it is an understanding problem my side.

Current Behavior

Price is 1.001 Ocean

Expected Behavior

1 Ocean

Steps to Reproduce

Here you can see my pricing config:

export const PRICING_CONFIGS: PricingConfig = {
  [Network.PONTUSX]: {
    FREE: {
      type: 'free'
    },
    FIXED_OCEAN: {
      type: 'fixed',
      freCreationParams: {
        fixedRateAddress: '0xAD8E7d2aFf5F5ae7c2645a52110851914eE6664b',
        baseTokenAddress: '0x0995527d3473b3a98c471f1ed8787acd77fbf009',
        baseTokenDecimals: 18,
        datatokenDecimals: 18,
        fixedRate: '1',
        marketFee: '0',
        marketFeeCollector: '0x0000000000000000000000000000000000000000'
      }
    },
    FIXED_EUROE: {
      type: 'fixed',
      freCreationParams: {
        fixedRateAddress: '0xAD8E7d2aFf5F5ae7c2645a52110851914eE6664b',
        baseTokenAddress: '0xe974c4894996E012399dEDbda0bE7314a73BBff1',
        baseTokenDecimals: 6, // adapted for EUROe decimals
        datatokenDecimals: 18,
        fixedRate: '1',
        marketFee: '0',
        marketFeeCollector: '0x0000000000000000000000000000000000000000'
      }
    }
  }
}

Maybe I should change something in the configuration.

Environment

  • OS: Windows 10 Enterprise
  • Node: v18.20.2
  • npm: 10.5.0


[Bug] structuredClone is not defined

Summary

I am trying to edit an asset using nautilus but it fails with the error

structuredClone is not defined


Current Behavior

I guess the AssetBuilder does not work correctly. The problem occurs at this step.
My code:

import { providers, Wallet } from 'ethers'
import { LogLevel, Nautilus, AssetBuilder } from '@deltadao/nautilus'
import { Request, Response } from 'express'

const provider = new providers.JsonRpcProvider('https://genx.sagresearch.de')
const wallet = new Wallet(process.env.PRIVATE_KEY, provider)
Nautilus.setLogLevel(LogLevel.Verbose)

const customConfig = {
  metadataCacheUri: 'https://aquarius510.v4.delta-dao.com'
}

export async function updateAsset(req: Request, res: Response) {
  const nautilus = await Nautilus.create(wallet, customConfig)

  const result = await nautilus.getAquariusAsset(
    'did:op:ac7543c622f1550fe81fdf8b25b91149c9759e4e4ee5edad78a20631baec6afa'
  )
  console.log(result)
  const assetBuilder = new AssetBuilder(result)
  console.log('assetBuilder', assetBuilder)
  // update asset and return status code
}
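A possible workaround while the library does not guard this call: structuredClone is only available as a global in Node.js 17+, so older runtimes (or some bundler/test environments) can define it before importing nautilus. This is a sketch under the assumption that the cloned objects are JSON-safe plain data (no Dates, Maps, functions, or cycles); a real polyfill such as the one in core-js is more robust.

```typescript
// Workaround sketch: provide a global structuredClone before importing nautilus.
// Assumption: the values being cloned are JSON-safe plain data.
if (typeof (globalThis as any).structuredClone === 'undefined') {
  (globalThis as any).structuredClone = (value: any) =>
    JSON.parse(JSON.stringify(value))
}

// Sanity check: the clone is a deep copy, not a reference.
const original = { metadata: { tags: ['a', 'b'] } }
const clone = structuredClone(original)
clone.metadata.tags.push('c')
console.log(original.metadata.tags.length) // still 2
```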

[BUG][0.3.0-alpha4] addConsumerParameter error

Nautilus Version: 0.3.0-alpha4

Example use:

    const consumerParameterBuilder = new ConsumerParameterBuilder()

    const param = consumerParameterBuilder
        .setType('number')
        .setName('myNumberParam')
        .setLabel('My Param Label')
        .setDescription('A description of my param for the enduser.')
        .setDefault('5')
        .setRequired(false)
        .build()

    const service = serviceBuilder
        .setServiceEndpoint(genx_config.providerUri) 
        .setTimeout(0) 
        .addFile(urlFile)
        .setPricing(pricingConfig)
        .setDatatokenNameAndSymbol(name, symbol)
        .addConsumerParameter(param)
        .build()

Error:

TypeError: Cannot read properties of undefined (reading 'push')

[BUG] setDatatokenNameAndSymbol should be part of the service builder

In the DDO, every datatoken of an asset is mapped to a specific service (multiple datatokens per asset are possible):

    "datatokens": [
      {
        "address": "0x6fD17dF555881cBA2c2686fB5F71Bb38f5aA132d",
        "name": "deltaDAO AlgoToken 001",
        "serviceId": "06ca5b39d4febb672046c42348935c957804d38c918b3e47c4b3fd8311e8b558",
        "symbol": "DD-AT-001"
      }
    ],
  • check if this is also wrong in ocean.js
  • move setDatatokenNameAndSymbol from assetBuilder to serviceBuilder
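Since the DDO maps each datatoken to a service via serviceId, the datatoken for a given service can be resolved directly from that array. A small sketch (the interface and helper names here are hypothetical, not part of the nautilus API):

```typescript
// Hypothetical types, mirroring the "datatokens" DDO excerpt above.
interface DatatokenInfo {
  address: string
  name: string
  serviceId: string
  symbol: string
}

// Resolve the datatoken that belongs to a service via the serviceId mapping.
function datatokenForService(
  datatokens: DatatokenInfo[],
  serviceId: string
): DatatokenInfo | undefined {
  return datatokens.find((dt) => dt.serviceId === serviceId)
}

const datatokens: DatatokenInfo[] = [
  {
    address: '0x6fD17dF555881cBA2c2686fB5F71Bb38f5aA132d',
    name: 'deltaDAO AlgoToken 001',
    serviceId: '06ca5b39d4febb672046c42348935c957804d38c918b3e47c4b3fd8311e8b558',
    symbol: 'DD-AT-001'
  }
]

const token = datatokenForService(
  datatokens,
  '06ca5b39d4febb672046c42348935c957804d38c918b3e47c4b3fd8311e8b558'
)
```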

Error when using addTags(...)

When using addTags, I'm getting the following error message:

TypeError: Cannot read properties of undefined (reading 'concat')

It seems related to the fact that, on the first call, the tags array passed as the first parameter is still undefined at:

this.asset.metadata.tags = combineArrays(this.asset.metadata.tags, tags)
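A defensive fix would be to default the possibly-undefined first array before combining. The sketch below reimplements combineArrays only for illustration (its actual signature and deduplication behavior in nautilus are assumptions based on the stack trace):

```typescript
// Sketch of a defensive combineArrays: tolerate an undefined first array,
// which is the state on the very first addTags(...) call.
function combineArrays<T>(first: T[] | undefined, second: T[]): T[] {
  // Deduplicate while preserving insertion order (behavior assumed).
  return Array.from(new Set([...(first ?? []), ...second]))
}

const tags = combineArrays(undefined, ['ai', 'energy'])
const more = combineArrays(tags, ['energy', 'health'])
console.log(more) // ['ai', 'energy', 'health']
```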

[BUG][0.3.0-alpha4] Publish error

Nautilus Version: 0.3.0-alpha4

Error:

ReferenceError: _interrupt is not defined

Publishing fails for every tested dataset and algorithm.
The same problem occurs with both the OCEAN token and EUROe.
Tested FileType URL in the following combinations:

  1. FREE/ACCESS
  2. FREE/COMPUTE
  3. FIXED/ACCESS
  4. FIXED/COMPUTE

Code:

async function main() {
    Nautilus.setLogLevel(LogLevel.Verbose)
    const nautilus = await Nautilus.create(web3, genx_config)

    await publishDataset(nautilus)
}

async function publishDataset(nautilus) {
    const pricingConfig: PricingConfigWithoutOwner = {
        type: 'free'
    }
    // const pricingConfig: PricingConfigWithoutOwner = {
    //     type: 'fixed',
    //     freCreationParams: {
    //         fixedRateAddress: genx_config.fixedRateExchangeAddress, 
    //         baseTokenAddress: '0xe974c4894996E012399dEDbda0bE7314a73BBff1', 
    //         baseTokenDecimals: 6,
    //         datatokenDecimals: 18, 
    //         fixedRate: '1', 
    //         marketFee: '0', 
    //         marketFeeCollector: '0x0000000000000000000000000000000000000000' 
    //     }
    // }

    // Create a new 'access' type service serving 'url' files
    const serviceBuilder = new ServiceBuilder(ServiceTypes.ACCESS, FileTypes.URL)

    const urlFile: UrlFile = {
        type: 'url', 
        url: 'https://bitcoin.org/bitcoin.pdf', 
        method: 'GET'
    }
    const name = 'My Datatoken Name'
    const symbol = 'SYMBOL'

    const service = serviceBuilder
        .setServiceEndpoint(genx_config.providerUri) 
        .setTimeout(0) 
        .addFile(urlFile)
        .setPricing(pricingConfig)
        .setDatatokenNameAndSymbol(name, symbol)
        .build()

    const assetBuilder = new AssetBuilder()
    const asset = assetBuilder
        .setType('dataset') 
        .setName('My New Asset')
        .setDescription('A publish asset building test') 
        .setAuthor('testAuthor')
        .setLicense('MIT') 
        .addService(service)
        .setOwner(web3.defaultAccount)
        .build()


    const result = await nautilus.publish(asset)
    console.log(result)

}

[Enhancement] Define Service FileType only 1 time instead of 2 times

Motivation / Problem

The service FileType is currently defined twice: as the 2nd parameter on builder instantiation and again in the urlFile object. This should be streamlined so the type is defined only once, to prevent mistakes.

    const serviceBuilder = new ServiceBuilder(ServiceTypes.ACCESS, FileTypes.URL) 

    const urlFile: UrlFile = {
        type: 'url',
        url: 'https://bitcoin.org/bitcoin.pdf',
        method: 'GET'
    }
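One way to streamline this would be to derive the accepted file object type from the FileTypes parameter chosen at builder creation, so a mismatched file is rejected at compile time. A minimal sketch with hypothetical, simplified types (not the actual nautilus builder):

```typescript
// Hypothetical simplified file types.
type UrlFile = { type: 'url'; url: string; method: 'GET' | 'POST' }
type IpfsFile = { type: 'ipfs'; hash: string }

type FileTypeMap = { url: UrlFile; ipfs: IpfsFile }

// The file type is stated once, at construction; addFile is constrained to it.
class TypedServiceBuilder<K extends keyof FileTypeMap> {
  private files: FileTypeMap[K][] = []
  constructor(readonly fileType: K) {}

  addFile(file: FileTypeMap[K]): this {
    this.files.push(file)
    return this
  }

  fileCount(): number {
    return this.files.length
  }
}

const builder = new TypedServiceBuilder('url')
builder.addFile({ type: 'url', url: 'https://bitcoin.org/bitcoin.pdf', method: 'GET' })
// builder.addFile({ type: 'ipfs', hash: 'Qm...' }) // would be a compile-time error
```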

[Enhancement] Enable whitelisting of algorithms for compute datasets

Motivation / Problem

We currently don't provide functionality to whitelist specific algorithms for compute datasets.

Solution

  • allow and document configuration in publish flow for compute type datasets

Alternatives

  • implement this in an edit asset flow in addition to publish, that could allow this functionality as well (#10 )
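For reference, the Ocean Protocol DDO already models this via the compute service's publisherTrustedAlgorithms list, which a publish-time whitelist would populate. The field names below follow the DDO spec; the concrete values are placeholders:

```typescript
// Shape of the DDO compute options a whitelist would populate.
interface PublisherTrustedAlgorithm {
  did: string
  filesChecksum: string
  containerSectionChecksum: string
}

interface ComputeOptions {
  allowRawAlgorithm: boolean
  allowNetworkAccess: boolean
  publisherTrustedAlgorithms: PublisherTrustedAlgorithm[]
  publisherTrustedAlgorithmPublishers: string[]
}

const compute: ComputeOptions = {
  allowRawAlgorithm: false,
  allowNetworkAccess: false,
  publisherTrustedAlgorithms: [
    {
      did: 'did:op:a2fd44a065902b9b84fc816e500dbf3cdcaf2a31a5b9cabf167c90025b896cf0',
      filesChecksum: '<files-checksum>', // placeholder
      containerSectionChecksum: '<container-checksum>' // placeholder
    }
  ],
  publisherTrustedAlgorithmPublishers: []
}
```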

[Enhancement] Start computation with two or more datasets

Motivation / Problem

For data integration scenarios, make it possible to start a computation that uses as input more than just one dataset. This seems possible when defining an algorithm, which receives an array of DIDs.

This way, it would be possible to start a computation job that combines multiple datasets.

Another interesting application is data transformations. Allowing two datasets as input, it would be possible to have generic mapping algorithms that receive as input the data to be mapped and the mapping to be used. This also enables a market of data transformation mappings, which are shared and monetised as any other kind of dataset.

Solution

When configuring a computation, it is possible to define multiple DIDs that will be passed to the algorithm as inputs.
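A sketch of what such a configuration could look like. The interface and parameter names here (additionalDatasets, etc.) are hypothetical illustrations of the idea, not the nautilus API, and the DIDs are placeholders:

```typescript
// Hypothetical compute config: one primary dataset plus additional inputs.
interface ComputeInput {
  did: string
  serviceId?: string
}

interface MultiDatasetComputeConfig {
  algorithm: ComputeInput
  dataset: ComputeInput
  additionalDatasets: ComputeInput[]
}

// Data-transformation scenario from above: a generic mapping algorithm
// receives both the data to be mapped and the mapping to be used.
const computeConfig: MultiDatasetComputeConfig = {
  algorithm: { did: 'did:op:<generic-mapping-algorithm>' },
  dataset: { did: 'did:op:<data-to-be-mapped>' },
  additionalDatasets: [{ did: 'did:op:<mapping-definition>' }]
}
```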

[BUG] Not enough gas supplied when adding fourth TrustedAlgorithm to service

Summary

When editing a service and trying to add a fourth trusted algorithm to a compute dataset an error occurs.

Current Behavior

When trying to add a trusted algorithm to a service of a compute asset the following error occurs if already three algorithms have been defined:
Error: processing response error (body="{\"jsonrpc\":\"2.0\",\"id\":550,\"error\":{\"code\":-32600,\"message\":\"not enough gas supplied for intrinsic gas costs\"}}", error={"code":-32600}, requestBody="{\"method\":\"eth_estimateGas\",\"params\":[{\"from\":\"0x4a806a4851472f7cfd579d3ff5465f03c3c2b5d4\",\"to\":\"0xd275d24f99c102d6addf165df60f2d99df7fcfa0\",\"data\":\"0x1aa3adf900000[...]000000000\"}],\"id\":550,\"jsonrpc\":\"2.0\"}", requestMethod="POST", url="https://rpc.genx.minimal-gaia-x.eu", code=SERVER_ERROR, version=web/5.7.1)

Expected Behavior

The fourth algorithm should be added without error.

Steps to Reproduce

  1. With the config:

     chainId: 100,
     network: 'genx',
     metadataCacheUri: 'https://aquarius510.v4.delta-dao.com',
     nodeUri: 'https://rpc.genx.minimal-gaia-x.eu',
     providerUri: 'https://provider.v4.genx.delta-dao.com',
     subgraphUri: 'https://subgraph.v4.genx.minimal-gaia-x.eu',
     oceanTokenAddress: '0x0995527d3473b3a98c471f1ed8787acd77fbf009',
     oceanTokenSymbol: 'OCEAN',
     fixedRateExchangeAddress: '0xAD8E7d2aFf5F5ae7c2645a52110851914eE6664b',
     dispenserAddress: '0x94cb8FC8719Ed09bE3D9c696d2037EA95ef68d3e',
     nftFactoryAddress: '0x6cb85858183B82154921f68b434299EC4281da53',
     providerAddress: '0x68C24FA5b2319C81b34f248d1f928601D2E5246B',
  2. Use addTrustedAlgorithms function and call nautilus.edit()
  3. Tested with the following EuPG-algorithms:
    did:op:a2fd44a065902b9b84fc816e500dbf3cdcaf2a31a5b9cabf167c90025b896cf0
    did:op:6a6172d10f41fe25529b9fb8204952890cb702040b1bbe2b6e58667bce30ac42
    did:op:bd74d6a281ba414de2b4d8ee4087277575f95676bd74e20ee9e2960c9c38d7c5
    did:op:a3da777fd3711da36d5e1e5904a8c074b6e8df51549db2b6c8a5bc7ec3ab60cf
  4. See error

Environment

  • nautilus 1.0.2
  • npm 9.8.1

Anything else

@moritzkirstein

[Enhancement] Support "SaaS" as asset type

Motivation / Problem

Support "SaaS" as additional asset type. An example can already be found here: https://cooperants.pontus-x.eu/asset/did:op:6fe6ae5f546adf88606a0da158389f44a0801c3fe87ddb6e1f63e9120e07989c but I am not sure how it was created (Portal and Nautilus do not support SaaS as type ... maybe using 'ethers' directly).

The main difference between "algorithm" and "saas" will be the lifetime of the service instance. The "algorithm" will have a finite lifespan, while "saas" might be infinite (based on the contract).

Solution

Extend Metadata::type: 'dataset' | 'algorithm' to Metadata::type: 'dataset' | 'algorithm' | 'saas'.

Possible inconsistencies

The "algorithm" asset takes a MetadataAlgorithm as configuration, which could be 'abused' for "saas" but might be misleading because of the naming. Maybe rename to Metadata(Runtime)Environment.
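The proposed change is a simple widening of the type union; a minimal sketch (the validator function is only illustrative):

```typescript
// Proposed widening of the metadata type union from the Solution section.
type AssetType = 'dataset' | 'algorithm' | 'saas'

// Illustrative runtime guard for the widened union.
function isValidAssetType(value: string): value is AssetType {
  return value === 'dataset' || value === 'algorithm' || value === 'saas'
}

console.log(isValidAssetType('saas')) // true
```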
