orangecoding / fredy

:heart: Fredy - [F]ind [R]eal [E]states [D]amn Eas[y] - Fredy will constantly search for new listings on sites like Immoscout or Immowelt and send new results to you, so that you can focus on more important things in life ;)

Home Page: http://www.orange-coding.net

License: MIT License


fredy's Introduction


Searching for an apartment in Germany can be a frustrating task. Not any longer, though: Fredy takes over and notifies you only once new listings have been found that match your requirements.

Fredy scrapes multiple services (Immonet, Immowelt etc.) and sends new listings to you as soon as they become available. The list of supported services can easily be extended. For your convenience, Fredy has a UI to help you configure your search jobs.

If Fredy finds matching results, it sends them to you via Slack, email, Telegram etc. (more adapters can be configured). Because Fredy stores the listings it has found, no result is ever sent to you twice (and as a side effect, Fredy can show some statistics). Furthermore, Fredy checks for duplicates on each scrape, so the same listing is not sent multiple times when it is posted on several platforms (which happens more often than one might think).
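The de-duplication idea can be sketched as follows (hypothetical code, not Fredy's actual implementation; the key function is illustrative):

```javascript
// Hypothetical sketch of cross-platform de-duplication: listings seen in
// earlier runs or from other providers within the same run are filtered
// out before notifying. In Fredy, this state is persisted between runs.
const seen = new Set();

function dedupe(listings, keyOf = (l) => `${l.title}|${l.price}|${l.size}`) {
  return listings.filter((listing) => {
    const key = keyOf(listing);
    if (seen.has(key)) return false; // already known -> skip
    seen.add(key);
    return true;
  });
}

// the same listing posted twice (e.g. on two platforms) is reported once
const first = dedupe([{ title: 'Flat A', price: '900 €', size: '60 m²' }]);
const second = dedupe([{ title: 'Flat A', price: '900 €', size: '60 m²' }]);
```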

Sponsorship

If you like my work, consider becoming a sponsor. I'm not expecting anybody to pay for Fredy or any other open-source project I maintain; however, keep in mind that I'm doing all of this in my spare time :) Thanks.

Fredy is supported by JetBrains under its Open Source Support Program.

Usage

  • Make sure to use Node.js 18 or above
  • Run the following commands:
yarn (or npm install)
yarn run prod
yarn run start

Fredy starts on the default port, 9998. You can access Fredy by opening your browser at http://localhost:9998. The default login is admin for both username and password. You should change the password as soon as possible if you plan to run Fredy on a server.

Screenshots: Job Configuration · Job Analytics · Job Overview

Understanding the fundamentals

There are three important parts of Fredy that you need to understand to leverage its full power.

Provider

Fredy supports multiple services; Immonet, Immowelt and Ebay are just a few examples. These services are called providers within Fredy. When creating a new job, you can choose one or more providers. A provider contains the URL that points to the search results of the respective service. If you search on immonet.de, the URL displayed in your browser is what the provider needs to do its magic. It is important that you order the search results by date, so that Fredy always picks up the latest results first!
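A provider might look roughly like this (a hypothetical sketch; the field names and the normalize shape are illustrative, not Fredy's exact interface):

```javascript
// Illustrative shape of a provider: it carries the search URL copied from
// the browser and knows how to turn a raw scraped entry into a normalized
// listing. Field names here are assumptions for illustration only.
const exampleProvider = {
  id: 'immonet',
  // URL copied from the browser AFTER sorting results by date (newest first)
  url: 'https://www.immonet.de/immobiliensuche/example-search',
  // turns one raw scraped entry into the normalized listing shape
  normalize: (raw) => ({
    id: raw.id,
    title: (raw.title || '').trim(),
    price: raw.price || 'N/A',
    link: raw.link,
  }),
};

const listing = exampleProvider.normalize({
  id: '1',
  title: ' Nice flat ',
  link: 'https://example.invalid/expose/1',
});
```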

Adapter

Fredy supports multiple adapters, such as Slack, SendGrid, Telegram etc. A search job can have as many adapters as Fredy supports. Each adapter needs different configuration values, which you have to provide when using it. An adapter dictates how the frontend renders it by telling the frontend what information it needs in order to send listings to the user.
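The idea of an adapter describing its own configuration fields can be sketched like this (hypothetical code; the property names are assumptions, not Fredy's actual adapter API):

```javascript
// Hypothetical sketch: an adapter declares the fields it needs, so a
// frontend can render a matching configuration form without knowing
// anything adapter-specific. Names are illustrative.
const slackAdapter = {
  id: 'slack',
  name: 'Slack',
  // the frontend reads this to know which inputs to show the user
  fields: {
    token: { type: 'text', label: 'Slack API token' },
    channel: { type: 'text', label: 'Channel to post listings into' },
  },
  send: ({ listings, config }) =>
    `Would post ${listings.length} listing(s) to ${config.channel}`,
};

const preview = slackAdapter.send({
  listings: [{ title: 'Flat A' }],
  config: { channel: '#flats' },
});
```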

Jobs

A job wraps adapters and providers. Fredy runs the configured jobs at a fixed interval, which can be configured in /conf/config.json.
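A minimal sketch of what /conf/config.json might contain (the key names are assumptions based on this README; check the shipped default config for the authoritative schema and the interval unit):

```json
{
  "interval": 60,
  "port": 9998
}
```

Here `port` is the UI/API port mentioned above and `interval` is the delay between job runs.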

Creating your first job

To create your first job, click the "Create New Job" button on the job table. The job creation dialog should be self-explanatory; however, there is one important thing: when configuring providers, before copying the URL from your browser, make sure you have sorted the results by date so that Fredy always picks up the latest results first.

User management

As an administrator, you can create, edit and remove users in Fredy. Be careful: each job is tied to the user who created it. If you remove a user, their jobs are removed as well.

Development

Running Fredy in development mode

To run Fredy in development mode, you need to run the backend & frontend separately. Start the backend with:

yarn run start

For the frontend, run:

yarn run dev

You should now be able to access Fredy from your browser. Check your Terminal to see what port the frontend is running on.

Running Tests

To run the tests, run

yarn run test

Architecture

(Architecture diagram: see the image in the repository.)

Immoscout / Immonet

I have added experimental support for Immoscout and Immonet. Both are somewhat special because they protect their services from bots using reCAPTCHA, and finding a way around this is barely possible. For Fredy to bypass this check, I'm using a service called ScrapingAnt. The trick is to use a headless browser with rotating proxies and (once successfully validated) to re-send the cookies on each request.
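The request flow can be sketched with Node 18's built-in fetch, based on the /v1/general endpoint and x-api-key header that appear in the error logs quoted later on this page (the API key and target URL below are placeholders):

```javascript
// Sketch of routing a scrape through ScrapingAnt. The endpoint and header
// match what is visible in the error logs on this page; API_KEY and the
// target URL are placeholders, not working values.
const API_KEY = 'YOUR_SCRAPINGANT_KEY';

function buildScrapingAntUrl(targetUrl) {
  return `https://api.scrapingant.com/v1/general?url=${encodeURIComponent(targetUrl)}`;
}

async function fetchThroughProxy(targetUrl) {
  const response = await fetch(buildScrapingAntUrl(targetUrl), {
    headers: { 'x-api-key': API_KEY },
  });
  if (!response.ok) {
    // surface API errors instead of letting the whole process crash
    throw new Error(`ScrapingAnt returned ${response.status}`);
  }
  return response.text();
}

const url = buildScrapingAntUrl('https://www.immobilienscout24.de/Suche/');
```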

To use Immoscout / Immonet, you need to create an account at ScrapingAnt and configure the API key in the "General Settings" tab (visible when logged in as administrator). The rest is handled by Fredy. Keep in mind that this support is experimental: there may be bugs and you may not always pass the reCAPTCHA check, but most of the time it works rather well :)

If you need more than the 1000 API calls allowed per month, I'd suggest opting for a paid account. ScrapingAnt loves open source, so they give all Fredy users a 10% discount with the code FREDY10 (disclaimer: I do not earn any money for recommending their service).

👐 Contributing

Thanks to all the people who already contributed!

See Contributing

Docker

Use the Dockerfile in this repository to build an image.

Example: docker build -t fredy/fredy /path/to/your/checkout (the argument is the build context, i.e. the directory containing the Dockerfile, not the Dockerfile itself)

Or use docker-compose:

Example: docker-compose build

Or use the container image that is built automatically:

docker pull ghcr.io/orangecoding/fredy:master

Create & run a container

Put your config.json into a path of your choice, such as /path/to/your/conf/.

Example: docker create --name fredy -v /path/to/your/conf/:/conf -p 9998:9998 fredy/fredy

Logs

You can browse the logs with docker logs fredy -f.

fredy's People

Contributors

ah-it-sltn, ali-sharafi, anbucher, bergo, carlambroselli, dalins, janekbettinger, joschi, jstnw10, mefarazath, noctarius, orangecoding, pomeloy, sebastianwilczek, sven-simonsen, vanhekthor


fredy's Issues

Handle scrapingant API errors instead of terminating program

Unfortunately my tmux session only contains 1781 lines and it starts mid-JSON, so I can't really tell where or why it started doing this. Here are the latest lines. It seems like there was an API issue on ScrapingAnt and Fredy didn't catch the error, perhaps?:

            'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36',
            useQueryString: true,
            'x-api-key': 'REDACTED',                                                                                                                                                            
            Cookie: 'reese84=3:N701nMUwP7cR0e1tzccZxw==:R5PmKGR1/HaiU5lk66aH3WMIjVdOfLqeXx/boVS7yFtuJA+SLJhhbgedGHmh2w8p0AuCGypwoO7o6AkdVF3nJdeUcCuYtSF+TFzrkLBfZto48550OX6lJLE/rGg1eTb3NgMj5VKnRDb86V92AZVFW7ypoyKx4yheokwBnEfqcFZfrIrPlTQa6Y2JpkLPvZPx6bgrMrXXoWqf6735LQYJ4u3ZtzK6gMAEsvEFSOxU66ztVo6QQMNAUFIrNMiO9YnXIw1kHZu37z9ty/9pQQBDMeOSvhFcQEI5jXtN/DWiBWqg5/TP4qlpwnKbfbxofOiYlf7+kjfs3fobUgrldFgycncXw56DQLx/zRpeO7xnLTlOEbFJ6tqvd2crIxwTPpY3wmERB6P+NDWXNQjswxHjjDmc5pIrTb4LtOJTzJqjHdkGOtUT5RQjPQz/vZT36bwk6ljOrpK4gK/VL+9Oti9yHQ==:kW7yWLLhKwaAx7FDY5OmRCaFpKSiw26NvaHWqUDHsyk='                                                                          
          },                                                                                                
          agent: undefined,                                                                                 
          agents: { http: undefined, https: undefined },                                                                                                                                                                
          auth: undefined,                                                                                                                                                                                              
          hostname: 'api.scrapingant.com',                                                                                                                                                                              
          port: null,                                                                                                                                                                                                   
          nativeProtocols: { 'http:': [Object], 'https:': [Object] },                                                                                                                                                   
          pathname: '/v1/general',                                                                                                                                                                                      
          search: '?url=https%3A%2F%2Fwww.immobilienscout24.de%2FSuche%2Fradius%2Fwohnung-mieten%3Fcenterofsearchaddress%3DHeidelberg%3B%3B%3B%3B%3BBahnstadt%26numberofrooms%3D3.0-%26pricetype%3Drentpermonth%26geocoordinates%3D49.40472%3B8.66334%3B4.0%26sorting%3D2%26enteredFrom%3Dresult_list&proxy_country=il'                                                                                                                         
        },                                                                                                                                                                                                              
        _ended: true,                                                                                                                                                                                                   
        _ending: true,                                                                                                                                                                                                  
        _redirectCount: 0,                                                                                                                                                                                              
        _redirects: [],                                                                                     
        _requestBodyLength: 0,                                                                                                                                                                                          
        _requestBodyBuffers: [],                                                                                                                                                                                        
        _onNativeResponse: [Function (anonymous)],                                                                                                                                                                      
        _currentRequest: [Circular *2],                                                                                                                                                                                 
        _currentUrl: 'https://api.scrapingant.com/v1/general?url=https%3A%2F%2Fwww.immobilienscout24.de%2FSuche%2Fradius%2Fwohnung-mieten%3Fcenterofsearchaddress%3DHeidelberg%3B%3B%3B%3B%3BBahnstadt%26numberofrooms%3D3.0-%26pricetype%3Drentpermonth%26geocoordinates%3D49.40472%3B8.66334%3B4.0%26sorting%3D2%26enteredFrom%3Dresult_list&proxy_country=il',                                                                               
        [Symbol(kCapture)]: false                                                                                                                                                                                       
      },                                                                                                                                                                                                                
      [Symbol(kCapture)]: false,                                                                                                                                                                                        
      [Symbol(kNeedDrain)]: false,                                                                                                                                                                                      
      [Symbol(corked)]: 0,                                                                                                                                                                                              
      [Symbol(kOutHeaders)]: [Object: null prototype] {                                                                                                                                                                 
        accept: [ 'Accept', 'application/json, text/plain, */*' ],                                                                                                                                                      
        'user-agent': [                                                                                     
          'User-Agent',                                                                                                                                                                                                 
          'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.85 Safari/537.36'                                                                                           
        ],                                                                                                                                                                                                              
        usequerystring: [ 'useQueryString', true ],                                                                                                                                                                     
        'x-api-key': [ 'x-api-key', 'REDACTED' ],                                                                                                                                               
        cookie: [                                                                                                                                                                                                       
          'Cookie',                                                                                                                                                                                                     
          'reese84=3:N701nMUwP7cR0e1tzccZxw==:R5PmKGR1/HaiU5lk66aH3WMIjVdOfLqeXx/boVS7yFtuJA+SLJhhbgedGHmh2w8p0AuCGypwoO7o6AkdVF3nJdeUcCuYtSF+TFzrkLBfZto48550OX6lJLE/rGg1eTb3NgMj5VKnRDb86V92AZVFW7ypoyKx4yheokwBnEfqcFZfrIrPlTQa6Y2JpkLPvZPx6bgrMrXXoWqf6735LQYJ4u3ZtzK6gMAEsvEFSOxU66ztVo6QQMNAUFIrNMiO9YnXIw1kHZu37z9ty/9pQQBDMeOSvhFcQEI5jXtN/DWiBWqg5/TP4qlpwnKbfbxofOiYlf7+kjfs3fobUgrldFgycncXw56DQLx/zRpeO7xnLTlOEbFJ6tqvd2crIxwTPpY3wmERB6P+NDWXNQjswxHjjDmc5pIrTb4LtOJTzJqjHdkGOtUT5RQjPQz/vZT36bwk6ljOrpK4gK/VL+9Oti9yHQ==:kW7yWLLhKwaAx7FDY5OmRCaFpKSiw26NvaHWqUDHsyk='                                                                                    
        ],                                                                                                                                                                                                              
        host: [ 'Host', 'api.scrapingant.com' ]                                                                                                                                                                         
      }                                                                                                                                                                                                                 
    },                                                                                                      
    data: {                                                                                                 
      detail: 'Internal server error. Try again later or contact us [email protected]'                                                                                                                            
    }                                                                                                                                                                                                                   
  },                                                                                                                                                                                                                    
  isAxiosError: true,                                                                                                                                                                                                   
  toJSON: [Function: toJSON]                                                                                                                                                                                            
}                             



                                                                                                                                                                                          
error Command failed with exit code 1.                                                                                                                                                                                  
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command

The only modification I did to fredy was add some debug statements to lib/notification/adapter/telegram.js:

const TelegramBot = require('node-telegram-bot-api'); // presumably required at the top of the file

exports.send = ({ serviceName, newListings, notificationConfig, jobKey }) => {
  const { token, chatId } = notificationConfig.find((adapter) => adapter.id === 'telegram').fields;

  const bot = new TelegramBot(token);

  let message = `Job: ${jobKey} | Service _${serviceName}_ found _${newListings.length}_ new listings:\n\n`;

  message += newListings.map(
    (o) =>
      `*${shorten(o.title.replace(/\*/g, ''), 45)}*\n` +
      [o.address, o.price, o.size].join(' | ') +
      '\n' +
      `[LINK](${o.link})\n\n`
  );

  console.log(chatId);
  console.log(message);
  console.log(opts); // note: `opts` is defined elsewhere in the file (not shown in this excerpt)

  return bot.sendMessage(chatId, message, opts);
};

Define more than one mail recipient with MailJet and SendGrid

I would like to add more than one mail recipient and tried the following separators between them:

  • space
  • comma
  • semicolon

None of them worked and if you have created one notification adapter of type A, Fredy doesn't allow you to create another one of the same type.

Uncaught exception with ImmoScout/ScrapingAnt

Hi,

yesterday I tried the ImmoScout provider for the first time. At least once, scraping/retrieval worked fine and yielded results. After a few hours, though, Fredy crashed with the following error:

node:internal/process/promises:246                                                 
          triggerUncaughtException(err, true /* fromPromise */);                   
          ^                                                                        
                                                                                   
Error: Request failed with status code 404                                         
    at createError (/usr/home/.../fredy/node_modules/axios/lib/core/createError.js:16:15)                                                                    
    at settle (/usr/home/.../fredy/node_modules/axios/lib/core/settle.js:17:12)                                                                              
    at IncomingMessage.handleStreamEnd (/usr/home/.../fredy/node_modules/axios/lib/adapters/http.js:293:11)                                                  
    at IncomingMessage.emit (node:events:402:35)                                   
    at endReadableNT (node:internal/streams/readable:1340:12)                      
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {   
[...]

Does this need to be caught somewhere or am I doing something wrong?

Another issue I faced is:

TypeError: Cannot read properties of undefined (reading 'substring')                                                                                                  
    at normalize (/usr/home/.../fredy/lib/provider/immoscout.js:8:58)                                                                                        
    at Array.map (<anonymous>)  

but this one could be easily fixed by checking if o.link is defined and setting it to empty if not. Apparently some ImmoScout entries do not have a link or the parsing goes wrong.
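The suggested guard can be sketched like this (the helper name is illustrative; the real fix would live in the normalize function of lib/provider/immoscout.js):

```javascript
// Sketch of the guard described above: fall back to an empty string when a
// listing has no link, so string methods are never called on undefined.
function safeLink(o) {
  return typeof o.link === 'string' ? o.link : '';
}

// entries without a link survive instead of throwing a TypeError
const links = [{ link: '/expose/1' }, {}].map((o) => safeLink(o).substring(0, 20));
```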

Setting Interval and Working Hours job-wise or provider-wise

It would be awesome to set the interval and working hours specifically for every job or even every provider. This would allow using the free 10000 API credits of ScrapingAnt most efficiently, while crawling all other sites at a higher interval.

Windows does not understand "export"

Describe the bug
When running yarn run prod on a windows machine the command export BUILD_DEV='false' in package.json fails with 'export' is not recognized as an internal or external command.

Expected behavior
Replacing all instances of export with set if running on a windows machine fixes the issue. I don't know how to do that in a dynamic way though.
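One common cross-platform approach (an assumption on my part, not something Fredy currently does) is the cross-env package, which sets environment variables the same way in cmd.exe, PowerShell and POSIX shells. A package.json sketch, where the build command placeholder stands for whatever currently follows the export:

```json
{
  "scripts": {
    "prod": "cross-env BUILD_DEV=false <original build command>"
  }
}
```

After `yarn add --dev cross-env`, the same script runs unchanged on Windows and Unix.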

UI text misleading / not updating according to configuration

Thank you for this! Amazing and beautiful work. I thought I'd share my minor findings on the UI/UX. Nothing major.

Is your feature request related to a problem? Please describe.
This is not related to a problem.

Describe the solution you'd like
Some UI texts are misleading regarding credit usage of ScrapingAnt. The credit usage text should reflect what is selected in the configuration.

Describe alternatives you've considered
None

Additional context
Below, it should say residential:
(screenshot)

In the job overview, the current usage of credits is shown. It should reflect the configuration (datacenter) and display 10 credits/call instead of 250.
(screenshot)

I also checked the ScrapingAnt website to see why premium is 250 credits. I could not find it. Should it rather be 125?
(screenshot)

Make the user aware of ongoing processes or waiting times

I've set up Fredy now in an Immoscout + Telegram combo, but I do not know what it is doing. Is it waiting for the next scan, did it already scan something, or did something go wrong in the Telegram setup?

Would be good to have some indicators for these things :)


Only one notification is sent with duplicated adapters

I have the use case that I want to send two emails to two different recipients within a single job.

Therefore, I added a second Mailjet Notification Adapter within the same job with the second recipient address:
(screenshot)

When new alerts show up, only a single notification within the same adapter type is sent and the other adapter ignored. Other adapters (here Console) are successfully triggered.

Immobilienscout is blocking ScrapingAnt now

I've realized that Fredy hasn't been posting any Immobilienscout listings lately. Apparently they upped their bot detection game during June: when trying to make a request through the ScrapingAnt dashboard, the result notes that the request was identified as a robot and was therefore blocked.

[Solved] Access Denied when running with Ubuntu - docker-compose

First of all thanks for the amazing project.

Describe the bug
I launched the project on Ubuntu with docker-compose, and when trying to log in with the default admin/admin credentials I got the following error:


$ docker-compose up
Starting fredy ... done
Attaching to fredy
fredy    | Started Fredy successfully. Ui can be accessed via http://localhost:9998
fredy    | Started API service on port 9998
fredy    | node:fs:590
fredy    |   handleErrorFromBinding(ctx);
fredy    |   ^
fredy    | 
fredy    | Error: EACCES: permission denied, open '/fredy/db/.users.json.tmp'
fredy    |     at Object.openSync (node:fs:590:3)
fredy    |     at Object.writeFileSync (node:fs:2202:35)
fredy    |     at TextFileSync.write (file:///fredy/node_modules/lowdb/lib/adapters/node/TextFile.js:63:12)
fredy    |     at JSONFileSync.write (file:///fredy/node_modules/lowdb/lib/adapters/node/JSONFile.js:48:66)
fredy    |     at LowdashAdapter.write (file:///fredy/node_modules/lowdb/lib/core/Low.js:60:26)
fredy    |     at Module.setLastLoginToNow (file:///fredy/lib/services/storage/userStorage.js:75:6)
fredy    |     at file:///fredy/lib/api/routes/loginRoute.js:28:17
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:37:14)
fredy    |     at Trouter.router.lookup (/fredy/node_modules/0http/lib/router/sequential.js:93:14)
fredy    |     at Object.service.lookup (/fredy/node_modules/restana/libs/request-router.js:75:49)
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:35:25)
fredy    |     at step (/fredy/node_modules/0http/lib/next.js:17:14)
fredy    |     at serveStatic (/fredy/node_modules/serve-static/index.js:75:16)
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:37:14)
fredy    |     at step (/fredy/node_modules/0http/lib/next.js:17:14)
fredy    |     at _cookieSession (/fredy/node_modules/cookie-session/index.js:135:5) {
fredy    |   errno: -13,
fredy    |   syscall: 'open',
fredy    |   code: 'EACCES',
fredy    |   path: '/fredy/db/.users.json.tmp'
fredy    | }
fredy exited with code 1



I solved using:

sudo chmod 777 -R db/

Feature: Limit fredy specific hours for deployment on a server

Hi!

Great project you got here. It really took my flat searching to the next level.

So, about this feature. My thinking was: if I only have 1k requests on Immoscout per month (because 20€ per month for 10k isn't exactly a bargain), it would be ideal to only scrape Immoscout when I was able to respond anyway. Obviously it's useful for a lot more applications, like reducing Fredy's spam during times you can't answer anyway and so on.

The best thing: I've already implemented it over at my fork.
The not-so-best thing: I don't really have the time for thorough testing, so it's probably pretty rough around the edges.

Things to improve I've noticed:

  • validation: currently it's possible to have begin time be after end time. This leads to never-working hours. Someone would have to either throw an error for this or consider that the end time might be the next day.
  • Similar issue with midnight, as the date pickers display that as 0:00.
  • design: I've just thrown two inputs in there, they don't really fit the theme.
  • editing: currently you can't, you have to delete and reenter. I was a bit lazy there.
  • actual testing: I mean, it works, but, you know how it is.
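The first two validation points can be handled by treating a begin time that is after the end time as a window spanning midnight, instead of rejecting it. A sketch (the helper name and the minutes-since-midnight representation are my assumptions, not code from the fork):

```javascript
// Sketch of working-hours validation: a naive `from <= now < to` check
// yields a never-active window when `from` is after `to`. Treating such a
// window as spanning midnight fixes both the inverted-range and the
// midnight (0:00) cases described above.
function isWithinWorkingHours(nowMinutes, fromMinutes, toMinutes) {
  if (fromMinutes === toMinutes) return true; // degenerate window: always on
  if (fromMinutes < toMinutes) {
    // same-day window, e.g. 08:00 - 22:00
    return nowMinutes >= fromMinutes && nowMinutes < toMinutes;
  }
  // overnight window, e.g. 22:00 - 06:00 (spans midnight)
  return nowMinutes >= fromMinutes || nowMinutes < toMinutes;
}

const duringDay = isWithinWorkingHours(12 * 60, 8 * 60, 22 * 60); // noon in 08-22
const overnight = isWithinWorkingHours(23 * 60, 22 * 60, 6 * 60); // 23:00 in 22-06
const outside = isWithinWorkingHours(12 * 60, 22 * 60, 6 * 60);   // noon in 22-06
```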

I realize you might have some financial stakes in actually getting people to buy ScrapingAnt memberships, so I'm not mad if this doesn't interest you.

If you're interested, hit me up with how I can help you transition the changes to your repo. Since it's not quite finished it would probably be ideal to set up a PR to a feature-branch on your side.

Error when clicking on general settings

Hello, I'm using Fredy for the first time. I noticed that when I click on general settings on the front end nothing shows and I get this error on the front page:

Uncaught Error: Minified React error #185; visit https://reactjs.org/docs/error-decoder.html?invariant=185 for the full message or use the non-minified dev environment for full errors and additional helpful warnings.

Allow multiple instances of one provider

Kleinanzeigen does not allow filtering searches by multiple city districts, while Fredy only permits one search instance for each provider.
Thus, multiple distinct jobs have to be set up for any city where filtering by district might be required.

Users should either be able to add multiple instances of one provider (in this case Kleinanzeigen) or be able to pass multiple search request URLs to the Kleinanzeigen provider.

Notification trigger issue

Hi and thank you for your efforts into this nice project!

I installed the Docker container, but I cannot set up any notifications (besides Console). I tried MailJet and Telegram, but I get an HTTP timeout (no error message at all) after clicking Try, and an error shows up in the logs.

Mailjet log error:
fredy | (node:1) UnhandledPromiseRejectionWarning: TypeError: Cannot destructure property 'apiPublicKey' of 'notificationConfig.find(...).fields' as it is undefined.
fredy | at Object.exports.send (/usr/src/fredy/lib/notification/adapter/mailJet.js:20:11)
fredy | at /usr/src/fredy/lib/api/routes/notificationAdapterRouter.js:28:13
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at Trouter.router.lookup (/usr/src/fredy/node_modules/0http/lib/router/sequential.js:93:14)
fredy | at Object.service.lookup (/usr/src/fredy/node_modules/restana/libs/request-router.js:62:49)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:30:25)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at /usr/src/fredy/lib/api/security.js:26:7
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at serveStatic (/usr/src/fredy/node_modules/serve-static/index.js:75:16)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at _cookieSession (/usr/src/fredy/node_modules/cookie-session/index.js:126:5)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | (node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 3)

And for telegram:
fredy | (node:1) UnhandledPromiseRejectionWarning: TypeError: Cannot destructure property 'token' of 'notificationConfig.find(...).fields' as it is undefined.
fredy | at Object.exports.send (/usr/src/fredy/lib/notification/adapter/telegram.js:14:11)
fredy | at /usr/src/fredy/lib/api/routes/notificationAdapterRouter.js:28:13
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at Trouter.router.lookup (/usr/src/fredy/node_modules/0http/lib/router/sequential.js:93:14)
fredy | at Object.service.lookup (/usr/src/fredy/node_modules/restana/libs/request-router.js:62:49)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:30:25)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at /usr/src/fredy/lib/api/security.js:26:7
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at serveStatic (/usr/src/fredy/node_modules/serve-static/index.js:75:16)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | at _cookieSession (/usr/src/fredy/node_modules/cookie-session/index.js:126:5)
fredy | at next (/usr/src/fredy/node_modules/0http/lib/next.js:32:14)
fredy | at step (/usr/src/fredy/node_modules/0http/lib/next.js:15:14)
fredy | (node:1) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 4)
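Both traces point at the same root cause: notificationConfig.find(...) returns undefined when the adapter has no saved fields, and destructuring a property of undefined then throws. A minimal sketch of a defensive lookup (the helper and field names here are illustrative, not Fredy's actual API):

```javascript
// Hypothetical helper: look up an adapter's saved fields and fail with a
// readable error instead of a TypeError on destructuring undefined.
function getAdapterFields(notificationConfig, adapterId) {
  const entry = (notificationConfig || []).find((c) => c.id === adapterId);
  if (!entry || !entry.fields) {
    throw new Error(
      `Notification adapter "${adapterId}" has no saved configuration fields`
    );
  }
  return entry.fields;
}

// With the guard in place, a caller can destructure safely:
// const { token, chatId } = getAdapterFields(config, 'telegram');
```

In practice this usually means the adapter's configuration was never persisted before "Try" was clicked, so saving the adapter settings first may also work around the crash.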

Immoscout with ScrapingAnt: Try a data-center request first before using residential proxies

Hi,

first of all, thank you for your amazing work. I have a small feature request: when scraping Immoscout with the help of ScrapingAnt, would it be possible to try a data-center proxy call before using residential proxies? The current call consumes 250 credits, which means that even on the $19/month plan only about 400 calls are allowed per month. Considering the very short reaction times needed to find a flat (for popular cities, emails should go out within 5 minutes), roughly 13 calls per day is just not enough. Data-center proxies would allow about 25x more calls. With the current credit consumption, using Fredy for Immoscout crawling doesn't help much unless one can spend $250 a month on ScrapingAnt.

Best,
Tom

Crash after login (Docker): "Error: EACCES: permission denied, open '/fredy/db/.users.json.tmp'"

Hi,

I cloned the repo with git clone and started the build with docker-compose build fredy.
I get the container up and running, but when I try to log in with admin/admin in the browser, the container crashes with the following message:

Creating network "fredy_default" with the default driver
Creating fredy ... done
Attaching to fredy
fredy    | (node:1) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
fredy    | (Use `node --trace-deprecation ...` to show where the warning was created)
fredy    | Started Fredy successfully. Ui can be accessed via http://localhost:9998
fredy    | Started API service on port 9998
fredy    | node:fs:590
fredy    |   handleErrorFromBinding(ctx);
fredy    |   ^
fredy    |
fredy    | Error: EACCES: permission denied, open '/fredy/db/.users.json.tmp'
fredy    |     at Object.openSync (node:fs:590:3)
fredy    |     at Object.writeFileSync (node:fs:2202:35)
fredy    |     at TextFileSync.write (file:///fredy/node_modules/lowdb/lib/adapters/TextFileSync.js:36:12)
fredy    |     at JSONFileSync.write (file:///fredy/node_modules/lowdb/lib/adapters/JSONFileSync.js:29:66)
fredy    |     at LowdashAdapter.write (file:///fredy/node_modules/lowdb/lib/LowSync.js:28:26)
fredy    |     at Module.setLastLoginToNow (file:///fredy/lib/services/storage/userStorage.js:75:6)
fredy    |     at file:///fredy/lib/api/routes/loginRoute.js:28:17
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:35:14)
fredy    |     at Trouter.router.lookup (/fredy/node_modules/0http/lib/router/sequential.js:93:14)
fredy    |     at Object.service.lookup (/fredy/node_modules/restana/libs/request-router.js:75:49)
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:33:25)
fredy    |     at step (/fredy/node_modules/0http/lib/next.js:15:14)
fredy    |     at serveStatic (/fredy/node_modules/serve-static/index.js:75:16)
fredy    |     at next (/fredy/node_modules/0http/lib/next.js:35:14)
fredy    |     at step (/fredy/node_modules/0http/lib/next.js:15:14)
fredy    |     at _cookieSession (/fredy/node_modules/cookie-session/index.js:135:5) {
fredy    |   errno: -13,
fredy    |   syscall: 'open',
fredy    |   code: 'EACCES',
fredy    |   path: '/fredy/db/.users.json.tmp'
fredy    | }
fredy exited with code 1

I tried this on 3 different virtual hosts running Ubuntu 22.10: one ARM64 machine and two x64 machines.

New Notification Adapter

Hi,

This is really amazing!

How can I code a new adapter? I've copied an existing one as a template, but it seems the providers are registered or listed somewhere, which is what populates the select input field.

Thanks a lot!

Best,
Tobi

Residential and datacenter strategy for Immobilienscout24 scraping

Is your feature request related to a problem? Please describe.
Datacenter scraping for Immobilienscout24 works too, but it may require more retries and is a bit slower, while residential scraping is faster and more expensive.

Describe the solution you'd like
Allow 2 strategies for Immobilienscout24 scraping:

  1. Datacenter-only: retry N times with datacenter proxies (note: also retry when the status code is 404; that's known behavior for this specific proxy pool)
  2. Residential-included: try with datacenter (preferably with retries), then switch to residential

So it would be possible to decide whether to use residential proxies or not, but retries would always apply.

Additional context
The datacenter-only approach would always return a successful result eventually, but it might take some time, while the residential-included approach would be faster and more expensive. This is a result of ScrapingAnt's custom proxy pool feature applied to Fredy.
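The two proposed strategies can be sketched as follows. This is illustrative only: fetchVia(proxyType) is a hypothetical injected function that performs one ScrapingAnt request and resolves to { status, body }; option names are made up:

```javascript
// Sketch of the requested proxy strategies. Any non-200 status (including
// the spurious 404s this proxy pool is reported to return) triggers a retry.
async function scrapeWithStrategy(fetchVia, { useResidential = false, retries = 3 } = {}) {
  // Datacenter first: cheap, so retry up to `retries` times.
  for (let attempt = 0; attempt < retries; attempt++) {
    const res = await fetchVia('datacenter');
    if (res.status === 200) return res;
  }
  // Residential-included strategy: fall back once to the pricier pool.
  if (useResidential) {
    const res = await fetchVia('residential');
    if (res.status === 200) return res;
  }
  throw new Error('scraping failed with all configured proxy strategies');
}
```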

Telegram notifications quickly run into rate limit

Is your feature request related to a problem? Please describe.
I have set up a search with 4 of the big flat-search providers and a Telegram bot to inform me and my girlfriend of the new flats it finds. At least when sending to a channel, the Telegram bot has a rate limit of only 10 messages per minute, and Fredy regularly exceeds this when I turn it on in the morning.
I would also prefer the bot to send only 1 flat per message, so we can easily delete uninteresting messages, but if I change the code setting for how many flats are sent per message, I hit the rate limit even faster.

Describe the solution you'd like
I would love it if Fredy knew about the Telegram rate limit and waited for a minute after every 9 messages, to slowly but error-free feed me my flat search results. Regrettably, my TypeScript skills are not up to implementing this change.
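The requested behavior can be sketched as a simple batch throttle; this is a minimal illustration, not Fredy's actual sending code, and `send` is an injected async function:

```javascript
// Send messages in order, sleeping for `windowMs` after every `limit`
// messages so a per-minute rate limit is never exceeded.
async function sendThrottled(messages, send, { limit = 9, windowMs = 60_000 } = {}) {
  for (let i = 0; i < messages.length; i++) {
    if (i > 0 && i % limit === 0) {
      await new Promise((resolve) => setTimeout(resolve, windowMs));
    }
    await send(messages[i]);
  }
}
```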

Describe alternatives you've considered
If this is not possible, I might just set up a Telegram bot for every search provider. It seems that Fredy is not cross-checking for duplicates between providers anyway?

Additional context

Error: Request failed with status code 429
    at createError (D:\dev\experiments\fredy\node_modules\axios\lib\core\createError.js:16:15)
    at settle (D:\dev\experiments\fredy\node_modules\axios\lib\core\settle.js:17:12)
    at IncomingMessage.handleStreamEnd (D:\dev\experiments\fredy\node_modules\axios\lib\adapters\http.js:322:11)
    at IncomingMessage.emit (node:events:538:35)
    at endReadableNT (node:internal/streams/readable:1345:12)
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'axios/0.26.1',
      'Content-Length': 333
    },
    method: 'post',
    url: 'https://api.telegram.org/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
    data: '{"chat_id":"-677533133","text":"<i>Mietwohnung um Bad Oldesloe</i> (immonet) found <b>3</b> new listings:\\n\\n<a href=\\"https://www.immonet.de/angebot/47748318\\"><b>HH-Harburg Zentrum: 3 Zimmer Whg. mit Balkon</b></a>\\nHamburg Harburg | Miete zzgl. NK 1.200 € | ca. 89.0 m²\\n\\n","parse_mode":"HTML","disable_web_page_preview":true}',
    'axios-retry': { retryCount: 0, lastRequestTime: 1657048583661 }
  },
  request: <ref *1> ClientRequest {
    _events: [Object: null prototype] {
      abort: [Function (anonymous)],
      aborted: [Function (anonymous)],
      connect: [Function (anonymous)],
      error: [Function (anonymous)],
      socket: [Function (anonymous)],
      timeout: [Function (anonymous)],
      prefinish: [Function: requestOnPrefinish]
    },
    _eventsCount: 7,
    _maxListeners: undefined,
    outputData: [],
    outputSize: 0,
    writable: true,
    destroyed: false,
    _last: true,
    chunkedEncoding: false,
    shouldKeepAlive: false,
    maxRequestsOnConnectionReached: false,
    _defaultKeepAlive: true,
    useChunkedEncodingByDefault: true,
    sendDate: false,
    _removedConnection: false,
    _removedContLen: false,
    _removedTE: false,
    _contentLength: null,
    _hasBody: true,
    _trailer: '',
    finished: true,
    _headerSent: true,
    _closed: false,
    socket: TLSSocket {
      _tlsOptions: [Object],
      _secureEstablished: true,
      _securePending: false,
      _newSessionPending: false,
      _controlReleased: true,
      secureConnecting: false,
      _SNICallback: null,
      servername: 'api.telegram.org',
      alpnProtocol: false,
      authorized: true,
      authorizationError: null,
      encrypted: true,
      _events: [Object: null prototype],
      _eventsCount: 10,
      connecting: false,
      _hadError: false,
      _parent: null,
      _host: 'api.telegram.org',
      _readableState: [ReadableState],
      _maxListeners: undefined,
      _writableState: [WritableState],
      allowHalfOpen: false,
      _sockname: null,
      _pendingData: null,
      _pendingEncoding: '',
      server: undefined,
      _server: null,
      ssl: [TLSWrap],
      _requestCert: true,
      _rejectUnauthorized: true,
      parser: null,
      _httpMessage: [Circular *1],
      [Symbol(res)]: [TLSWrap],
      [Symbol(verified)]: true,
      [Symbol(pendingSession)]: null,
      [Symbol(async_id_symbol)]: 2443,
      [Symbol(kHandle)]: [TLSWrap],
      [Symbol(kSetNoDelay)]: false,
      [Symbol(lastWriteQueueSize)]: 0,
      [Symbol(timeout)]: null,
      [Symbol(kBuffer)]: null,
      [Symbol(kBufferCb)]: null,
      [Symbol(kBufferGen)]: null,
      [Symbol(kCapture)]: false,
      [Symbol(kBytesRead)]: 0,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(connect-options)]: [Object],
      [Symbol(RequestTimeout)]: undefined
    },
    _header: 'POST /bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'Content-Type: application/json\r\n' +
      'User-Agent: axios/0.26.1\r\n' +
      'Content-Length: 333\r\n' +
      'Host: api.telegram.org\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    _keepAliveTimeout: 0,
    _onPendingData: [Function: nop],
    agent: Agent {
      _events: [Object: null prototype],
      _eventsCount: 2,
      _maxListeners: undefined,
      defaultPort: 443,
      protocol: 'https:',
      options: [Object: null prototype],
      requests: [Object: null prototype] {},
      sockets: [Object: null prototype],
      freeSockets: [Object: null prototype] {},
      keepAliveMsecs: 1000,
      keepAlive: false,
      maxSockets: Infinity,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      maxTotalSockets: Infinity,
      totalSocketCount: 4,
      maxCachedSessions: 100,
      _sessionCache: [Object],
      [Symbol(kCapture)]: false
    },
    socketPath: undefined,
    method: 'POST',
    maxHeaderSize: undefined,
    insecureHTTPParser: undefined,
    path: '/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
    _ended: true,
    res: IncomingMessage {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 4,
      _maxListeners: undefined,
      socket: [TLSSocket],
      httpVersionMajor: 1,
      httpVersionMinor: 1,
      httpVersion: '1.1',
      complete: true,
      rawHeaders: [Array],
      rawTrailers: [],
      aborted: false,
      upgrade: false,
      url: '',
      method: null,
      statusCode: 429,
      statusMessage: 'Too Many Requests',
      client: [TLSSocket],
      _consuming: false,
      _dumped: false,
      req: [Circular *1],
      responseUrl: 'https://api.telegram.org/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
      redirects: [],
      [Symbol(kCapture)]: false,
      [Symbol(kHeaders)]: [Object],
      [Symbol(kHeadersCount)]: 18,
      [Symbol(kTrailers)]: null,
      [Symbol(kTrailersCount)]: 0,
      [Symbol(RequestTimeout)]: undefined
    },
    aborted: false,
    timeoutCb: null,
    upgradeOrConnect: false,
    parser: null,
    maxHeadersCount: null,
    reusedSocket: false,
    host: 'api.telegram.org',
    protocol: 'https:',
    _redirectable: Writable {
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 3,
      _maxListeners: undefined,
      _options: [Object],
      _ended: true,
      _ending: true,
      _redirectCount: 0,
      _redirects: [],
      _requestBodyLength: 333,
      _requestBodyBuffers: [],
      _onNativeResponse: [Function (anonymous)],
      _currentRequest: [Circular *1],
      _currentUrl: 'https://api.telegram.org/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
      [Symbol(kCapture)]: false
    },
    [Symbol(kCapture)]: false,
    [Symbol(kNeedDrain)]: false,
    [Symbol(corked)]: 0,
    [Symbol(kOutHeaders)]: [Object: null prototype] {
      accept: [Array],
      'content-type': [Array],
      'user-agent': [Array],
      'content-length': [Array],
      host: [Array]
    }
  },
  response: {
    status: 429,
    statusText: 'Too Many Requests',
    headers: {
      server: 'nginx/1.18.0',
      date: 'Tue, 05 Jul 2022 19:16:29 GMT',
      'content-type': 'application/json',
      'content-length': '109',
      connection: 'close',
      'retry-after': '5',
      'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
      'access-control-allow-origin': '*',
      'access-control-expose-headers': 'Content-Length,Content-Type,Date,Server,Connection'
    },
    config: {
      transitional: [Object],
      adapter: [Function: httpAdapter],
      transformRequest: [Array],
      transformResponse: [Array],
      timeout: 0,
      xsrfCookieName: 'XSRF-TOKEN',
      xsrfHeaderName: 'X-XSRF-TOKEN',
      maxContentLength: -1,
      maxBodyLength: -1,
      validateStatus: [Function: validateStatus],
      headers: [Object],
      method: 'post',
      url: 'https://api.telegram.org/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
      data: '{"chat_id":"-677533133","text":"<i>Mietwohnung um Bad Oldesloe</i> (immonet) found <b>3</b> new listings:\\n\\n<a href=\\"https://www.immonet.de/angebot/47748318\\"><b>HH-Harburg Zentrum: 3 Zimmer Whg. mit Balkon</b></a>\\nHamburg Harburg | Miete zzgl. NK 1.200 € | ca. 89.0 m²\\n\\n","parse_mode":"HTML","disable_web_page_preview":true}',
      'axios-retry': [Object]
    },
    request: <ref *1> ClientRequest {
      _events: [Object: null prototype],
      _eventsCount: 7,
      _maxListeners: undefined,
      outputData: [],
      outputSize: 0,
      writable: true,
      destroyed: false,
      _last: true,
      chunkedEncoding: false,
      shouldKeepAlive: false,
      maxRequestsOnConnectionReached: false,
      _defaultKeepAlive: true,
      useChunkedEncodingByDefault: true,
      sendDate: false,
      _removedConnection: false,
      _removedContLen: false,
      _removedTE: false,
      _contentLength: null,
      _hasBody: true,
      _trailer: '',
      finished: true,
      _headerSent: true,
      _closed: false,
      socket: [TLSSocket],
      _header: 'POST /bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage HTTP/1.1\r\n' +
        'Accept: application/json, text/plain, */*\r\n' +
        'Content-Type: application/json\r\n' +
        'User-Agent: axios/0.26.1\r\n' +
        'Content-Length: 333\r\n' +
        'Host: api.telegram.org\r\n' +
        'Connection: close\r\n' +
        '\r\n',
      _keepAliveTimeout: 0,
      _onPendingData: [Function: nop],
      agent: [Agent],
      socketPath: undefined,
      method: 'POST',
      maxHeaderSize: undefined,
      insecureHTTPParser: undefined,
      path: '/bot5116548208:AAHO8HuYUnlOFe8GTBHQdn9qNEWk5nRW1hU/sendMessage',
      _ended: true,
      res: [IncomingMessage],
      aborted: false,
      timeoutCb: null,
      upgradeOrConnect: false,
      parser: null,
      maxHeadersCount: null,
      reusedSocket: false,
      host: 'api.telegram.org',
      protocol: 'https:',
      _redirectable: [Writable],
      [Symbol(kCapture)]: false,
      [Symbol(kNeedDrain)]: false,
      [Symbol(corked)]: 0,
      [Symbol(kOutHeaders)]: [Object: null prototype]
    },
    data: {
      ok: false,
      error_code: 429,
      description: 'Too Many Requests: retry after 5',
      parameters: [Object]
    }
  },
  isAxiosError: true,
  toJSON: [Function: toJSON]
}

Login issues

After running the Docker image I'm able to start Fredy, but the login user and password are wrong.

The db folder was missing after downloading, and it seems that Fredy can't find users.json.tmp.
What's wrong?

(screenshot attached: Screenshot 2024-03-08 at 20:30:11)

Installed via Synology Container Manager.

Telegram " description: 'Bad Request: message is too long'"

Hi, awesome app! It would also be great to have an SMTP notification provider. And I sometimes receive this error when using Telegram:

  data: {
    ok: false,
    error_code: 400,
    description: 'Bad Request: message is too long'
  }
},
isAxiosError: true,
toJSON: [Function: toJSON]

Perhaps split overly long messages into multiple messages?
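Telegram's Bot API caps message text at 4096 characters, so one way to handle this is to split on listing boundaries before sending. A sketch (splitting at newlines is an assumption about how listings are separated in Fredy's message text):

```javascript
// Split a message into chunks no longer than Telegram's 4096-character
// limit, preferring to break at newlines so listings stay intact.
function splitMessage(text, maxLen = 4096) {
  const chunks = [];
  let rest = text;
  while (rest.length > maxLen) {
    let cut = rest.lastIndexOf('\n', maxLen);
    if (cut <= 0) cut = maxLen; // no newline found: hard split
    chunks.push(rest.slice(0, cut));
    rest = rest.slice(cut).replace(/^\n/, '');
  }
  if (rest.length > 0) chunks.push(rest);
  return chunks;
}
```

Each chunk would then be sent as its own sendMessage call (which also interacts with the rate-limit issue reported elsewhere in this list).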

Telegram Adapter 'Too Many Requests'

Describe the bug
I am using Fredy for the first time and created a new job using the Telegram adapter. Initially a lot of listings were found. In the log I see many HTTP 429 responses like this:

[AxiosError: Request failed with status code 429] {
  code: 'ERR_BAD_REQUEST',
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
...

I think the issue is that there is currently no limit on how many messages are sent. The Telegram Bots FAQ says "avoid sending more than one message per second" and "your bot will not be able to send more than 20 messages per minute to the same group" (see https://core.telegram.org/bots/faq#my-bot-is-hitting-limits-how-do-i-avoid-this).

So I think there should be some delay between requests to the Telegram API, and maybe also some kind of retry mechanism if sending a message fails.
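When Telegram answers 429, the response body includes parameters.retry_after (the number of seconds to wait, per the Bot API). A hedged retry sketch, with `send` as an injected async function resolving to the parsed response body:

```javascript
// Retry a Telegram send on 429, waiting as long as the API suggests in
// `parameters.retry_after`. Illustrative only, not Fredy's actual adapter.
async function sendWithRetry(send, message, maxRetries = 3) {
  for (let attempt = 0; ; attempt++) {
    const res = await send(message);
    if (res.ok) return res;
    const retryAfter = res.parameters && res.parameters.retry_after;
    if (res.error_code !== 429 || attempt >= maxRetries || retryAfter == null) {
      throw new Error(`Telegram send failed: ${res.description}`);
    }
    await new Promise((resolve) => setTimeout(resolve, retryAfter * 1000));
  }
}
```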

Add landlord/owner/offerer and address option

I'd love to have the opportunity to scrape landlords and the related addresses (when given) for mapping purposes. I'm particularly interested in housing groups and less concerned with private names or the like.

The tool I am looking for is for (journalistic) research purposes. It would be great if you could add such a feature.

The only alternative I can see is to search it by hand, which would be much more time-consuming.

"Unexpected end of file" when trying to scrape Immowelt

Whenever I try to set up Immowelt I get this error:

"Error while trying to scrape data. Received error: unexpected end of file"

It doesn't scrape anything off of Immowelt, even with a fresh config and database .json file.
I've checked the formatting and there doesn't seem to be any issue there.

How can I resolve this?

This is the specific URL I used with Fredy:
https://www.immowelt.de/liste/berlin/wohnungen/mieten?ami=30&d=true&pma=700&sd=DESC&sf=TIMESTAMP&sp=1

Error message on console

Steps to reproduce

  • Created an Immoscout provider with a valid URL
  • Created a Telegram adapter (test message received)

On every request, I see an error like this:
Error while trying to scrape data. Received error: Request failed with status code 423

Immoscout: Parse listing from basesearch given html source & dump search results

Given the results of a base search, I would like to parse the Immoscout HTML source for a given listing in order to filter by attributes that are not exposed by the Immoscout search functionality.

For instance, fields like "Bezugsfrei ab" or "Ausstattung" are relevant cues as to whether a user might be interested in a search result. Is this feature available in Fredy, or how could it be added (to Fredy or to an offline parser)?
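As an offline post-filter, extracting a labeled field from a listing page could look like the sketch below. The dt/dd markup is an assumption for illustration only; Immoscout's real markup differs and changes, so a production version would use a proper HTML parser (e.g. cheerio):

```javascript
// Illustrative only: extract the value following a labeled field such as
// "Bezugsfrei ab" from a listing's HTML, assuming <dt>label</dt><dd>value</dd>
// pairs. Returns null when the label is not present.
function extractField(html, label) {
  const re = new RegExp(
    `<dt[^>]*>\\s*${label}\\s*</dt>\\s*<dd[^>]*>([^<]*)</dd>`,
    'i'
  );
  const match = re.exec(html);
  return match ? match[1].trim() : null;
}
```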

Immoscout support

Supporting Immoscout would be great :). I tried it out a bit and so far it looks good!

Use residential proxies for Immoscout

I'd like to suggest an enhancement for Immoscout scraping.
From my observation, the standard ScrapingAnt proxies are unreliable when it comes to bot detection.

My suggestion is the following:

  1. Try using a standard proxy
  2. If detected, retry with residential

Also, as an alternative, retries using standard proxies can be added before switching to residential:

  1. Try using a standard proxy
  2. If detected, retry using standard proxies N times
  3. If still detected, retry with residential

A residential request costs more, but it looks like it is still cheaper than repeated retries with standard proxies.

Possible to exclude estate agents?

Hello Guys,
first of all: Love your project, thank you!!!

I'd like to find listings without real estate agents, or at least exclude SOME of them. We have 3 or 4 pretty evil ones in our city that I would never buy/rent from. The blacklist doesn't seem to exclude them (I only tried Immowelt). Would it be possible to add that as a feature?

Looking forward to your reply!

Support for meinestadt.de

Is your feature request related to a problem? Please describe.
Hi, thanks for this awesome project; hopefully we will find a nice flat in the future...
How about adding support for:

https://www.meinestadt.de/frankfurt-am-main/immobilien/wohnungen

They have nice listings.

Additional context

Con: The site uses POST requests to process queries.
Pro: There is a nicely parsable XHR JSON object to work with (see the dev console in the browser).

400 Bad Request returned by the Telegram API ("can't parse entities"); logging could be improved as well

I had set up a Telegram adapter. The chat ID was taken from @RawDataBot. Further, I validated the token by successfully running:

curl -X GET https://api.telegram.org/botREDACTED:REDACTED

The Fredy dashboard says it has found 20 listings, but there was an error while sending the message to Telegram. It would help if there were more debug statements; currently it's unclear what the API call payload was, as only the response is logged.

shell@ubuntu-2gb-fsn1-6:~/fredy$ yarn run start
yarn run v1.22.5
$ node index.js
Started Fredy successfully. Ui can be accessed via http://localhost:9998
Started API service on port 9998
ErrorClass [HTTPError]: Response code 400 (Bad Request)
    at IncomingMessage.<anonymous> (/home/shell/fredy/node_modules/yarl/main.js:179:22)
    at IncomingMessage.emit (node:events:381:22)
    at endReadableNT (node:internal/streams/readable:1307:12)
    at processTicksAndRejections (node:internal/process/task_queues:81:21) {
  host: 'api.telegram.org',
  hostname: 'api.telegram.org',
  method: 'POST',
  path: '/botREDACTED:REDACTED/sendMessage',
  response: {
    statusCode: 400,
    statusMessage: 'Bad Request',
    body: `{"ok":false,"error_code":400,"description":"Bad Request: can't parse entities: Can't find end of the entity starting at byte offset 58"}`
  }
}
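"Can't find end of the entity" usually means the message text contains characters that are special to the chosen parse_mode (e.g. an unescaped <, > or & in a listing title when parse_mode is HTML). Per the Bot API, those three characters must be escaped in any user-supplied text; a sketch:

```javascript
// Escape the characters that Telegram's parse_mode=HTML treats as markup.
// Listing titles like "3 Zimmer < 700€ & Balkon" would otherwise trigger
// a 400 "can't parse entities" response.
function escapeTelegramHtml(text) {
  return text
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}
```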


Make Fredy runnable on a Raspberry Pi

Hey,

over the weekend I tried to set up Docker on my Raspberry Pi 3 with Fredy running.
The image build fails at RUN yarn install with a non-zero code (133), even though a manual yarn install yields no errors.
Trying the prebuilt image also fails, because the Raspberry Pi uses a different hardware architecture.

Has anybody else tried to run it on a Pi and could share some insights?

Immoscout scraping started failing (almost?) always

It appears that scraping Immoscout has stopped working reliably. This may not actually be a bug in Fredy but rather a change on Immoscout's side? Either way, all my Immoscout providers keep failing, with both datacenter and residential proxies. I don't think I've seen a successful Immoscout scrape in days (I was on my own patched fork before; now I've updated to current master and it's the same, all retries seem to fail).

Can someone check or reproduce this? Or might this be some problem on my side?
Maybe @kami4ka can shed some light from the scraping ant side? :)
