The Most Active and Friendliest
Affiliate Marketing Community Online!


My first breakeven journey

My original plan was to just blacklist ASNs when asn.organization_name contains a word like "Data Center" or similar. How can you blacklist ASNs by location? Can you give me some details?
How well is that going to work? Run this grep if you are using this file:
Bash:
$  head -n1 geolite2-asn-blocks-ipv4.csv
network,autonomous_system_number,autonomous_system_organization

$ grep -i 'amazon'  geolite2-asn-blocks-ipv4.csv

Amazon is a leading source of bots, scrapers, and ad-spy tools, and you can see that matching "Data Center" or similar is no good for many cases.
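A minimal sketch of turning that CSV into an nginx block list, assuming the GeoLite2 ASN layout shown above; the sample rows and the org keywords here are invented for illustration:

```shell
# Sample rows standing in for geolite2-asn-blocks-ipv4.csv (invented):
cat > asn-sample.csv <<'EOF'
network,autonomous_system_number,autonomous_system_organization
3.0.0.0/15,16509,AMAZON-02
8.8.8.0/24,15169,GOOGLE
34.64.0.0/10,396982,GOOGLE-CLOUD-PLATFORM
EOF

# Keep only hosting/datacenter-ish orgs and emit nginx "deny" directives:
grep -iE 'amazon|cloud|hosting|data ?center' asn-sample.csv |
  cut -d, -f1 |
  sed 's/^/deny /; s/$/;/' > block_asns.conf

cat block_asns.conf
```

The point of the keyword list is exactly the problem above: "Data Center" alone misses Amazon, Google Cloud, and friends, so you end up curating the keywords by hand.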

By live lookup I mean getting the host (PTR record). Some never reply and just wait out the timeout, PHP's gethostbyaddr() in particular; the bash CLI never has issues.

No reverse DNS, no email server: not human. But that can be evaded too --residential proxies.

example:
$ host 54.174.53.35
Host 35.53.174.54.in-addr.arpa. not found: 3(NXDOMAIN)

That was a HubSpot bot trying to validate one of my domains; it gets 408 or 403 depending on the request. Not the best example, as it has nothing to do with ad campaigns (the topic at hand).
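A hedged sketch of that live PTR check with a hard timeout, so a resolver that never answers can't stall anything; `timeout` and `getent` are standard on Linux, and the IP and 2-second cutoff are just illustrative:

```shell
# Reverse-DNS (PTR) check that can't hang: timeout kills the lookup after 2s.
ip=8.8.8.8
if timeout 2 getent hosts "$ip" >/dev/null 2>&1; then
  echo "PTR found for $ip"
else
  echo "no reverse DNS for $ip (flag it)"
fi
```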

You really have to look at this after the fact to fine-tune any (almost accurate) results.

So, you think NGINX is funny? It is a leading production server. Got an opinion on Apache2 too :D Use what you like.
I send 408 replies to the bogies I am detecting.

Elasticsearch I have used in the past. Hope you don't plan to waste time tracking forged and out-of-date user agents --there are a shit-ton of those on the ad networks :rolleyes: you are planning to use. Tell Chrome 64 I said hello (and GFY); you want to be able to reject stuff like that at the RTB source via the networks. Browser versions are a necessity --browsers auto-update, and the current real ones use HTTP/2.0.
If it's not HTTP/2.0, most of the time it's a bot (or it could be Selenium or headless Chrome).
The hardest part of tracking ad network traffic is determining the traffic's real validity --to the best of your abilities. If you can sort out headless Chrome (Selenium) and disregard its events, you will solve most of the real tracking issues today. Accomplish that.
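The HTTP/2.0 tell can be eyeballed straight from the access log; a sketch assuming nginx's default "combined" format, with invented sample lines:

```shell
# Two invented access-log lines in nginx "combined" format:
cat > access.log <<'EOF'
1.2.3.4 - - [21/Sep/2022:15:19:50 +0000] "GET / HTTP/2.0" 200 154 "-" "Mozilla/5.0"
5.6.7.8 - - [21/Sep/2022:15:19:51 +0000] "GET /robots.txt HTTP/1.1" 302 154 "-" "curl/7.68.0"
EOF

# Count requests per protocol: field 2 between the quotes is the request line,
# whose third word is the protocol.
awk -F'"' '{ split($2, r, " "); n[r[3]]++ } END { for (p in n) print p, n[p] }' access.log
```

Anything heavy on HTTP/1.0 or HTTP/1.1 from "Chrome" user agents is worth a closer look.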


This is the only useful thing I got from this post

If you are using rotating proxies to make (only) your IPs appear different on the fly, your browser ID will be the same if you are using WebDriver and Selenium/headless Chrome. The open-source project says the IP is not a factor --this is an advantage :D

The biggest problem in tracking is understanding what you are seeing and what the real value is of the traffic.
 
If you are using rotating proxies to make (only) your IPs appear different on the fly, your browser ID will be the same if you are using WebDriver and Selenium/headless Chrome. The open-source project says the IP is not a factor --this is an advantage :D
^Above is not true --the webpage lied ...
5435c138d767341071dc81685321cad7 < ssh tunnel (VPN)
1945084e5fff6486c760762ac9511648 < ISP IP
same browser, same computer --that didn't work as expected at all.
 
Every GUI interface I have seen. I think on some ad network APIs you may be able to post a CIDR list, but I don't know what the limits are.

$ wc -l /etc/nginx/conf.d/block_asns-hisec.conf
36209 /etc/nginx/conf.d/block_asns-hisec.conf
Most lines are /24, /20, etc.
The problem is, if you can't block that traffic on the ad network's side, you still have to pay for it.

What I do is only block traffic on the ad network's side, and track every hit that passes through. I can filter the junk traffic out of my reports (I add a 'suspicious' field to my table, so I can easily filter it out in Kibana later. Better yet, I can filter on the suspicious score, so I can adjust the threshold dynamically).
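The adjustable-threshold idea can be sketched with a plain awk filter over an exported report; the column layout, the scores, and the cutoff of 50 are all invented for illustration:

```shell
# Invented report rows: ip, clicks, suspicious_score (0-100).
cat > hits.csv <<'EOF'
ip,clicks,suspicious_score
1.2.3.4,10,5
5.6.7.8,3,87
9.9.9.9,7,42
EOF

# Keep the header plus rows at or below the threshold; retune by changing max.
awk -F, -v max=50 'NR==1 || $3+0 <= max' hits.csv
```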
 
You will be smacking moles forever. You will see.

Click Cease and Click Guard both do a great job of preventing click fraud and filtering the crap out of traffic. I know many like to think "I'll just create my own", but honestly, as a marketer, is this really the best use of your time?

I've been in the computer industry since 1981. I was a manufacturer of mainframe and mini-mainframe supply-side products throughout the '80s, a software developer for the banking industry throughout the nineties, and have been an online marketer since the late nineties. While I have been a restaurant and nightclub owner and an owner-operator of several other industry businesses, I can tell you factually that developing in-house tools while in the midst of launching a new business is a setback to your goals and not a benefit.

When I opened Automated Data Supply in 1981 the primary goal was to build the most successful provider of all supply side needs in the mainframe and mini-mainframe markets. I did it in two years. That is when I expanded into manufacturing the products myself and contracting for blank label products I could not manufacture. I waited until the core of the business reached its original primary goal before expanding it. This is a critical requirement in the modeling of business launches. This is taught at every higher learning institution on the planet because it is founded in thousands of years of examples.

Do yourself a favor and spend your time learning, implementing, and getting to a point with your marketing that it stands on its own and with a team. Then start to expand with developments of new ventures into the industries space. Not to do so will likely result in a very long and drawn out process that in the end is likely to provide an abundance of discouragement.

I'm not saying don't do it, I am saying "one step at a time"!
 
I've never seen these numbers match one another!
That's funny; why?
I once redirected traffic from one server with a tracker I wrote => to a second server with a tracker I wrote.
The IN=>OUT=>IN discrepancy was ±1% (or so).
Possible conclusions:
  1. shaving,
  2. incompetent statistics coding,
  3. receiving network overloads and errors,
  4. undisclosed user (bot) filtering on reporting servers.
or a combination of factors.
The results may vary by the offer's server (conversion target), the day, the time, randomness.

Good luck to the winners!
 
I know who's knocking on my door and where I sent them

Yes, but you are the exception, not the rule.

I can't spend the time developing such things. My plate is overflowing all of the time, and I simply do not want to spend the money on an application that will need constant updates and rewrites. That's another game of whack-a-mole.

Spending $100 or $200 a month, or more, is cheap compared to the cost of having a highly trained full stack person re-creating something that is already on the market. I am in a couple of Joint Venture groups on FB, and elsewhere and I have had this discussion with some of the biggest in the business and none of them, not one, would be interested in going that route of self development for this purpose. They all implement a subscribed service.

We all admire your ability and talent, and I am not a slouch when it comes to development, but I prefer placing talent on my team; recreating a platform for this purpose is a massive endeavor when done right and would run into many tens of thousands to achieve the same quality.

I'll bring the data once I've completed my tests with ClickCease. It was recommended to me by several of the extremely high-end private lux charter companies I promote, and these are companies with very deep pockets, yet they choose to subscribe rather than develop.
 
I'm going to use "Amazon", "Data Center", FingerprintJS, and Google reCAPTCHA v3 in my tracker to flag abnormal traffic, then find some patterns and use them in my auto-optimizer to send blacklist signals to the ad network through its API.
My experience has been to analyze the traffic IPs after the fact and create internal ASN blocks. Nginx makes that easy. These networks are not able to use CIDR "properly" ... so you block one IP and they will use another soon.
allow 209.85.238.0/24;   # google favicon  # "org": "AS396982 Google LLC"
deny  64.233.173.0/24;   # Google-Proxy scraper
deny  64.233.173.137/32; # Google-Proxy scraper  # Google Cloud LLC (whois)
deny  107.178.192.0/18;  # GOOGLE-CLOUD US
deny  66.102.6.0/24;     # google-proxy
# googlebot -- ALLOW:
#deny 66.249.64.0/19;    # GOOGLE-CLOUD-PLATFORM

$ grep -ic 'GOOGLE-CLOUD-PLATFORM' geolite2-asn-blocks-ipv4.csv
365
$ grep -ic 'GOOGLE-CLOUD-PLATFORM' geolite2-asn-blocks-ipv6.csv
22
# that is a count of the entire CIDRs -- now block a million(?) IPs one-by-one LOL
Bottom line: you will have to sort, reject, and eat the costs of a lot of substandard and some non-human traffic. Part of the game. 'Landed' RTB traffic has cost me $0.07-$0.43 (mostly tier 1). I am not really sure of the 'landed' traffic's quality --see how many perform a CPA event.
 
Headless Chrome can be set up to evade all captcha versions now.
I just have some ideas to share here.
If Chrome doesn't send back a bot score, then I will just mark the traffic suspicious.
Chrome sending a wrong bot score should never happen, because I get the score on my server side.

For browsers that don't support JavaScript, I will use
<noscript>
<img src="/noscript.png">
</noscript>
to mark that traffic as suspicious.
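On the server side, the beacon hits can then be pulled out of the access log; a sketch with invented log lines, assuming the /noscript.png path from the snippet above:

```shell
# Invented log lines; only no-JS clients would have fetched /noscript.png.
cat > access.log <<'EOF'
1.2.3.4 - - [10/Oct/2022:10:00:00 +0000] "GET /lander HTTP/2.0" 200 512
1.2.3.4 - - [10/Oct/2022:10:00:01 +0000] "GET /noscript.png HTTP/2.0" 200 43
5.6.7.8 - - [10/Oct/2022:10:00:02 +0000] "GET /lander HTTP/2.0" 200 512
EOF

# Unique IPs that loaded the <noscript> beacon:
grep -F '/noscript.png' access.log | awk '{ print $1 }' | sort -u
```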

But I still need some practice to prove this works. LOL
 
I'm doing the final test on my tracker today. I found a problem.
I have no problem sending push notifications to Chrome on Windows and Android, but on Firefox `alerts.useSystemBackend` is `false` by default, which means almost no end users will receive my notifications. And on Android Firefox, only the beta version can set `alerts.useSystemBackend` on the `about:config` page.
I'd like to know how you guys deal with this. (I'm inclined to just ignore all Firefox users.)

Thank you.
@T J Tutor @Graybeard
 
What is 'Landed' RTB traffic?
Landed is a term I learned from my uncle; he owned a small chain of furniture stores. One summer vacation from high school, I worked in the stores with him, writing price tags for the furniture.
He taught me to use the 'landed cost' for the merchandise on the display floor: what we bought the units for plus the train-load freight we paid. The landed cost was the cost of goods acquired (COGA) 'landed' at our location < made that acronym up :)

In business, the term "landed cost" refers to the total cost of acquiring and delivering a good or service to the point of sale. This includes the cost of the good or service itself, as well as any taxes, duties, shipping, and insurance costs incurred along the way.
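As a toy calculation (all numbers invented), landed cost is just the sum of those components:

```shell
# Toy landed-cost arithmetic: goods + freight + duty + insurance, per unit.
goods=100 freight=12 duty=5 insurance=3
echo "landed cost = \$$((goods + freight + duty + insurance))"
```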

When I buy ad media, I see it as an investment. I want to get the most bang for my buck, so to speak.

That's why I only want to count valid ad hits or referrals, i.e., the "total cost of acquiring and delivering" the ad to my target offer (money page).

The junk traffic and obvious non-human bot traffic is worthless and discarded. By only counting the valid ad hits, I am able to get a better realistic metric on the actual cost, value and performance of the media I buy. This allows me to make more informed decisions on where to allocate my ad budget. I can also track the performance of individual ad campaigns more accurately. This is important to me because I want to know which campaigns are performing well and which ones need to be tweaked or scrapped altogether.

The counter-argument is the affinity of the offer --will it sell enough and pay out enough to justify the ad expense? Where to find the weak link in the overall situation is always a conundrum.

Overall, counting only valid ad hits is the best way to get a true picture of the value of ad media. It allows me to make more informed decisions and measure DSP traffic performance more accurately.

RTB
is a real-time bidding platform (an ad auction hub); the ad networks buy a lot of their traffic at these 'auctions' and resell it on a DSP (demand-side platform) --the dashboard you see at 'the' DSP network.

These networks? You mean those ad networks, like PropellerAds?

Every GUI interface I have seen. I think on some ad network APIs you may be able to post a CIDR list, but I don't know what the limits are.

$ wc -l /etc/nginx/conf.d/block_asns-hisec.conf
36209 /etc/nginx/conf.d/block_asns-hisec.conf
Most lines are /24, /20, etc.
 
alerts.useSystemBackend
is coming from where?
Type 'about:config' in the Firefox address bar.

I'm not sure if pop works; I will only try push traffic, since I think pop is too aggressive. I just have no idea why aggressive traffic works. Why do people even click on a page that pops up from nowhere? I would definitely close it immediately.

By the way, would it be too much if I send a push notification every hour? I'm afraid of being blocked by Chrome. :rofl
 
@Graybeard

False is the default on my side; I use the latest version.
 
Let's see what I have come up with so far.
1. I'm going to use native traffic in Tier 1 English-speaking countries.
2. I'm going to collect push subscriptions on my landing page.
3. I'm going to integrate Revcontent (OpenRTB 2.5). That way I can block suspicious traffic without paying for it. (T^T Thank you so much @Graybeard. In fact, I'm still hesitating over which ad networks to choose, but it seems Revcontent has fewer restrictions and better-qualified traffic compared to others. Any advice is super appreciated.)
4. I'm going to promote the top 10 ClickBank offers.
5. I'm going to use the collected push subscriptions for MaxBounty offers.

I'm not sure how much I will spend to get my first conversions this way.
I hope I don't go bankrupt. Clapping
 
Where are the developer docs? All I see is HTML *knowledge base* BS.
Specifically with regard to CIDR.
If I understand correctly, you can send HTTP status code 204 when you don't want to bid. That way you can block any suspicious and useless traffic and target just the target persona. (Could you share the income and housing data with me? Is it from some public database?)

You can get the zip code of any IP by using this free database from MaxMind.
```
$ head -n1 ./GeoLite2-City-Blocks-IPv4.csv
network,geoname_id,registered_country_geoname_id,represented_country_geoname_id,is_anonymous_proxy,is_satellite_provider,postal_code,latitude,longitude,accuracy_radius
```

Their support hasn't replied to my email yet. But from this page, Open RTB Bid Request Specifications | ExoClick Documentation, you can get a rough idea of what is available.

cookie sites
What are cookie sites?
 
Let me know how that works out ... You will be smacking moles forever. You will see.
I would rather block 3 million IPs that I know are useless. Zero effort now.
Then try to whitelist at the DSP --the rest is the cost of doing business.

Let's get going, move some traffic, and see ...
 
Happy Weekend everybody!

I'm going to write a small site with about 10 articles (two each day) so that I can use Google Ads later.
That's what I'm going to do next week. The week after next, I'm starting to promote my first offer!!!
 
Click Cease and Click Guard both do a great job of preventing click fraud and filtering the crap out of traffic.
Show me a data report of what is blocked and why.
I looked at ClickCease and all I see is marketing BS.
The API docs only show a curl example.
On what metrics do you base that decision?

Headless Chrome can be set up to evade all captcha versions now.
That is why it is whack-a-mole ...

IMO tools are good for some purposes, but usually they are for the 'non-technical user'.
Also, that tool costs $90/mo forever and you still have to monitor it.

I know who's knocking on my door and where I sent them :p
54.36.148.16 [21/Sep/2022:15:19:50 +0000] "GET / HTTP/2.0" 302 154 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"
54.36.148.208 [21/Sep/2022:15:19:48 +0000] "GET /robots.txt HTTP/2.0" 302 154 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"
185.220.100.249 [21/Sep/2022:12:51:59 +0000] "GET / HTTP/1.1" 302 165 "http://www.000.000" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.130 Safari/537.36"
 
Hi everybody,
Because of my shiny object syndrome, I'm going to postpone my first promotion to Oct 10, 2022.

Following is my new plan:
1. I'm going to use just one landing page and one offer from ClickBank.
2. I'm going to collect both an email list and push subscribers on my landing page.
3. I'm going to generate 5 blogs with the help of AI copywriting tools (as my free traffic source), and inject some in-text ads into my landing page and offer page respectively.
4. I'm going to use both Google Search and native as my paid traffic sources.

PS: I don't expect to get free organic traffic from the 5 generated blogs, but Google Search doesn't allow single-page websites as far as I know, so I need them anyway.
 
Yes, but you are the exception, not the rule.
I go far beyond that:
2022-05-28 12:09:44,1653739784,69723890082,593931,108870268971,79.228.169.105,HTTP/2.0,"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.63 Safari/537.36"
2022-05-28 12:16:19,1653740179,29510504854,520468,108870390884,188.195.26.38,HTTP/2.0,"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"
2022-05-28 12:17:43,1653740263,69658679838,647550,108870414655,79.247.206.237,HTTP/2.0,"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36"

I know who clicked out and when (in some cases which CTA link) from a landing page or a push ad that was direct.

They are just csv files that can be loaded into a database server on a cron --when needed.
I wrote the code, I am not selling the code, there is no customer support, and no guarantee of maintenance.

bikabon is doing the same thing, his way, and adding the GUI.

Generally, I don't need GUIs. I'd rather not have to learn them, nor have the added work of configuring them. We used them, but on a site with 1 million unique events a day; then it's time-effective IMO, as you have people with all levels of skills, and missions, using them.
 
Where does Google say that?
Link?

but Google Search doesn't allow single-page websites as far as I know

I can rank a 1-page site top.
It only depends on what's on that page.
If you post the answer to eternal life,
and it's not a scam,
you've got high-qual inbound links;
Google will rank your page top.
True or false @Graybeard
?
 
single page sites are still ignored these days? <- I fixed it.
In fact, I did mean "allowed". I was talking about buying Google Search traffic. I heard people complain that their Google Ads accounts get banned, but I don't know what exactly they had done. Others say it's safer to have a fully functional website (not just a landing page) to avoid getting banned by Google.
 
alerts.useSystemBackend
is coming from where?
(I just want to ignore all users using Firefox)
I do that with push now^ but IIRC pop works OK?
I use FF, and there are default settings blocking push --except when it is 'allowed' in the browser.
Chrome is 70%+ of the traffic and 99% of the 'targets' anyway.
But I still need some practice to prove them work. LOL
I tried something similar --headless Chrome will run JavaScript just like a browser would --zero detects.
 