Comments (39)
- Define the struct of the request/query on the API
- Dump the databases
- Build a base API
from das-poc.
Aug 9, 2023
Yesterday I implemented Terraform code for provisioning resources on Vultr (testing only, since I still don't have the access credentials) and followed the provisioning that is temporarily being done manually throughout the day. I wrote a Python script to connect to the Redis and MongoDB databases for connection tests and data capture, designed some adaptations to the application architecture, and estimated AWS costs with serverless databases for Redis and MongoDB.
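The connection-test script itself isn't in this log; a minimal sketch of what such a check could look like (the `check_backends` helper and the example host names are assumptions, not the actual script):

```python
# Sketch of a Redis/MongoDB reachability check. The helper takes already-built
# clients so it can be exercised without live databases; building real clients
# (redis.Redis(...), pymongo.MongoClient(...)) is shown in the comment below.

def check_backends(redis_client, mongo_client):
    """Ping both backends and report which ones are reachable."""
    status = {}
    try:
        status["redis"] = bool(redis_client.ping())  # Redis PING -> True
    except Exception:
        status["redis"] = False
    try:
        mongo_client.admin.command("ping")           # MongoDB ping -> {"ok": 1.0}
        status["mongodb"] = True
    except Exception:
        status["mongodb"] = False
    return status

# Real usage (requires redis-py and pymongo; hosts are placeholders):
#   import redis, pymongo
#   r = redis.Redis(host="redis.example.internal", port=6379)
#   m = pymongo.MongoClient("mongodb://mongo.example.internal:27017/")
#   print(check_backends(r, m))
```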
mongodb - AWS Pricing Calculator.pdf
My Estimate - AWS Pricing Calculator.pdf
Aug 9, 2023
Vultr provider documentation for terraform: https://registry.terraform.io/providers/vultr/vultr/latest/docs
Aug 9, 2023
MongoDB and Redis response times for a query over 100k records:
Aug 9, 2023
Repositories created for testing:
https://github.com/levisingularity/DAS-infra-stack-servless
https://github.com/levisingularity/DAS-function
https://github.com/levisingularity/DAS-infra-stack-aws
Aug 10, 2023
Between yesterday and today I worked on the Vultr servers, creating them dynamically with Terraform (see project https://github.com/levisingularity/DAS-infra-stack-vultr). I also created a structure to provision the Lambda function resources on AWS (see project: https://github.com/levisingularity/DAS-infra-stack-aws).
Aug 10, 2023
I was able to create a Lambda function, use S3 to store the source code zip (https://s3.console.aws.amazon.com/s3/buckets/das.singularitynet.io?region=us-east-1&prefix=production/&showversions=false), and run the test code in AWS Lambda:
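The test code itself isn't shown here; a minimal Lambda handler of the kind that gets zipped to S3 and wired up this way might look like the following (the event fields are hypothetical, not the actual test code):

```python
import json

def handler(event, context):
    """Minimal AWS Lambda entry point: echo a greeting built from the event."""
    name = event.get("name", "world") if isinstance(event, dict) else "world"
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test (Lambda passes a real context object; None is fine here):
#   handler({"name": "DAS"}, None)
```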
Aug 10, 2023
Terraform code running on my machine:
Aug 10, 2023
Test script response time on Vultr's bare metal server:
Aug 11, 2023
Today I managed to set up a server with OpenFaaS simulating a Python function running serverless. We can already create as many functions as we want and invoke them:
Aug 12, 2023:
Detailed architecture drawing:
Aug 15, 2023:
Yesterday I was able to integrate OpenFaaS with AWS ECR as the Docker image registry; now we can build and push the function images in OpenFaaS using the AWS ECR registry.
Aug 16, 2023:
Here is the latest evolution of the architecture, made yesterday:
singnet/das-infra-stack-vultr#1
singnet/das-pre-infra-aws#1
singnet/das-pre-infra-vultr#1
singnet/das-infra-stack-aws#1
Aug 16, 2023: Yesterday the Terraform script for the API Gateway was created, and today I configured it and opened the PRs to the new repositories.
Aug 17, 2023:
Today I managed to make available on AWS an instance with OpenFaaS, with the test functions already deployed.
If you want to test it, use the link and credentials below:
http://3.92.235.170:8080/ui/
user: admin
pass: 66U6dPbkQgBtDTIfIhZSK03Ox9BVQLV6BOlj5sSCdCAaMXTBN0aMk5IXlnaikfq
Built the pipeline script to validate the Terraform code on GitHub Actions:
https://github.com/singnet/das-infra-stack-aws/actions/runs/5897267840
Today I will apply this pipeline to all DAS infra projects and add the step to create the release and tag following semantic versioning.
https://github.com/singnet/das-scripts-pipeline/pull/1
I managed to make the Lambda connect to Redis and MongoDB and insert the data into both, but when I make the call to Redis it gives an error related to the slot. I tried to use rediscluster but it doesn't want to work; I will now try redis-py-cluster to see if that works.
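The slot error is characteristic of talking to a Redis Cluster with a non-cluster-aware client: each key hashes (CRC16 mod 16384) to one of 16384 slots, and the client gets MOVED errors when the slot lives on another node, while a cluster-aware client (e.g. `redis.cluster.RedisCluster` in recent redis-py) follows those redirects. A sketch of the slot computation, including the `{hash-tag}` rule:

```python
def crc16_xmodem(data: bytes) -> int:
    """CRC-16/XMODEM (poly 0x1021, init 0), the checksum Redis Cluster uses."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else crc << 1
            crc &= 0xFFFF
    return crc

def key_slot(key: str) -> int:
    """Map a key to one of the 16384 cluster slots, honoring {hash tags}."""
    start = key.find("{")
    if start != -1:
        end = key.find("}", start + 1)
        if end != -1 and end != start + 1:   # non-empty tag: hash only the tag
            key = key[start + 1:end]
    return crc16_xmodem(key.encode()) % 16384
```

Keys sharing a hash tag (e.g. `{das}:a` and `{das}:b`) land in the same slot, which is how multi-key operations are kept on a single node.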
I managed to complete the integration; I just couldn't find the reason why Redis was taking longer than MongoDB to respond. I'm also looking at how to port the semantic-versioning pipeline code from GitLab to GitHub.
List of natively supported technologies for creating functions in OpenFaaS:
https://github.com/openfaas/templates
And in AWS Lambda:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-runtimes.html?icmpid=docs_lambda_help
Created Docker images on the DockerHub registry:
Created documentation of DAS system infrastructure architecture:
https://docs.google.com/document/d/1kQhM62T3TIb3ECoqBxmqPulMciJq7Vev3-f_Gnhes_s/edit
Created a repository on GitHub for pipeline scripts and created a pipeline to generate the release tag for all projects: https://github.com/singnet/das-scripts-pipeline
Configuring OpenFaaS to upload the functions and test the connection with Redis and MongoDB:
I made 3 functions in Vultr's OpenFaaS, connected to the Redis and MongoDB databases:
- One that fetches 10 thousand random DAS nodes from Redis
- One that fetches 10 thousand random DAS nodes from MongoDB
- Another that is an adaptation of DAS to run as a function
You can access and see the functions within Openfaas by accessing the link below:
http://149.28.222.79:8080/ui/
user: admin
pass: ask for the password
To see the code for each one, just look at the repos:
- https://github.com/levisingularity/DAS-function/tree/main/list-data-mongo (only fetches 10 thousand nodes from the MongoDB database)
- https://github.com/levisingularity/DAS-function/tree/main/list-data-redis (only fetches 10 thousand nodes from the Redis database)
- https://github.com/levisingularity/das/tree/feature/openfaas (adaptation of DAS to run on OpenFaas)
You can see that the adaptations I made to DAS were few. Basically, I built a file to deploy the function in OpenFaaS (das-function.yml), renamed the main function from main to handler (service/server.py → handler.py), changed the reference to this main file in service/client.py, and removed the gRPC call to use the OpenFaaS HTTP standard instead (provisionally, because plain HTTP was simpler).
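The renamed handler entry point dispatches on the "action" field of the request body (the same shape as the body examples later in this log); a stripped-down sketch of that pattern, with the actual DAS database lookups replaced by hypothetical stubs:

```python
import json

# Hypothetical stubs standing in for the real DAS database lookups.
def _get_node(body):
    return {"node_type": body.get("node_type"), "node_name": body.get("node_name")}

def _query(body):
    return {"matches": []}

ACTIONS = {"get_node": _get_node, "query": _query}

def handle(req: str) -> str:
    """OpenFaaS-style entry point: JSON string in, JSON string out."""
    body = json.loads(req)
    action = ACTIONS.get(body.get("action"))
    if action is None:
        return json.dumps({"error": f"unknown action: {body.get('action')}"})
    return json.dumps(action(body))
```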
Working Redis and MongoDB list scripts on the Vultr infra:
Instructions for testing DAS already working in OpenFaas:
1- http://149.28.222.79:8080/ui/
2- Log in with your credentials:
user: admin
password: ask for the password
3- Use the DAS function (image attached)
body example:
{ "action":"get_node", "database_name": "das", "node_type": "node-example", "node_name": "node-name" }
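Invoking the deployed function from Python is a plain POST to the gateway's /function/<name> route; a sketch using only the standard library (the function name das-function is an assumption):

```python
import json
import urllib.request

GATEWAY = "http://149.28.222.79:8080"

def build_invoke_request(function_name: str, payload: dict) -> urllib.request.Request:
    """Build (but do not send) a POST request to an OpenFaaS function."""
    return urllib.request.Request(
        f"{GATEWAY}/function/{function_name}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it is a network call, so it is only sketched here:
#   req = build_invoke_request("das-function",
#                              {"action": "get_node", "database_name": "das",
#                               "node_type": "node-example", "node_name": "node-name"})
#   with urllib.request.urlopen(req, timeout=30) as resp:
#       print(json.loads(resp.read()))
```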
The query endpoint is already completed, but I'm still trying to resolve the timeout problem.
The following JSON is an example of a query:
{
  "action": "query",
  "database_name": "das",
  "query": {
    "And": [
      {
        "Link": {
          "link_type": "Evaluation",
          "ordered": true,
          "targets": [
            {
              "Variable": {
                "variable_name": "000000214999369af91fb563b4e0eadb"
              }
            },
            {
              "Variable": {
                "variable_name": "1a0738cb1a6b6b8ce7bae84c4296c0ce"
              }
            }
          ]
        }
      }
    ]
  }
}
I've resolved the timeout problem, and now the query function is working properly alongside the other ones.
I'm creating the pipelines to deploy the functions to AWS
I've configured the DAS client to call AWS functions, and the database is still being populated on AWS. Here's the log as of now:
2023-10-17 11:34:14 INFO Parsed 134205645/141269154 (95%)
2023-10-17 11:35:36 INFO Parsed 135618336/141269154 (96%)
2023-10-17 11:37:03 INFO Parsed 137031027/141269154 (97%)
2023-10-17 11:38:24 INFO Parsed 138443718/141269154 (98%)
2023-10-17 11:40:01 INFO Parsed 139856409/141269154 (99%)
2023-10-17 11:42:13 INFO Parsed 141269100/141269154 (100%)
2023-10-17 11:42:13 INFO Finished parsing file.
2023-10-17 11:42:13 INFO Populating MongoDB link tables
2023-10-17 13:46:59 INFO Populating Redis
2023-10-17 13:46:59 INFO Building key-value file
I'm setting up the versioning pipeline in DAS.
DasClient being called from the terminal
2023-10-17.22-24-15.mp4
I fixed the function call problem in AWS, and I'm configuring a new machine to run the script that populates the database.
About my update, I'm working on the following:
- Run the functions perfectly in AWS
- Make the client class implementation run perfectly on both OpenFaaS/Vultr and AWS
- Upload the Mongo and Redis databases to AWS (always monitoring memory, disk, processing, and logs)
- Run load tests and performance tests (measure delay and response time for each method) and generate reports of these tests
- Document well where each repo, code, class, file, and usage example is located (create a doc on Google Drive)
Tool for performance tests: https://github.com/wg/wrk
Documentation in progress: https://docs.google.com/document/d/1njmP_oXw_0FLwoXY5ttGBMFGV2n60-ugAltWIuoQO10/edit?usp=sharing
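wrk covers the HTTP load-generation side; for the per-method delay and response-time numbers, a tiny timing harness like this could be a starting point (the percentile choices and sample count are arbitrary, and the client call in the comment is a placeholder):

```python
import time

def measure_latency(fn, samples: int = 100) -> dict:
    """Call fn repeatedly and return latency percentiles in milliseconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        fn()
        timings.append((time.perf_counter() - start) * 1000.0)
    timings.sort()
    pct = lambda p: timings[min(int(p / 100 * samples), samples - 1)]
    return {"min": timings[0], "p50": pct(50), "p95": pct(95), "max": timings[-1]}

# Example: wrap one DAS method call, e.g.
#   measure_latency(lambda: client.get_node("node-example", "node-name"), samples=200)
```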
Related Issues (20)
- Add node type cache
- Add get_node_name()
- Create script to fetch FlyBase knowledge base
- New SQL parser step: create references for FKs related to relevant rows
- Change Evaluation -> Execution in SQL mapping
- PKEY nodes are being built but not being added
- Another Test
- Provide a DAS with FlyBase data in Vultr to test architecture options
- Build instructions not relying on docker
- Add feature to re-write a link based on assignment
- Temp name file is being overwritten when canonical loading multiple files
- Ensure the operation of the functions on AWS
- Run load tests and performance tests (measure delay and response time for each method) and generate a report of the test results
- Deploy the Mongo and Redis databases on AWS, and continuously monitor memory, disk space, processing, and logs
- Ensure the execution of the client class implementation on both OpenFaaS/Vultr and AWS
- Create infrastructure environments using Terraform workspaces
- Create pipeline to create an instance in AWS and populate the database
- Load new database to AWS
- Migration and finalization of DAS deployment architecture document