Adaptive Cycle + RNN Anomaly Detection (NEW!)

================================================================================

PART 1: BEAST ANOMALY DETECTOR SERIES

================================================================================


Three related indicators, shared as research previews. All descend from

mcdon030's original "Beast Autoencoder-RNN" published in the ThinkOrSwim

community. The originals combined an autoencoder, a simplified RNN, and kNN

anomaly scoring into a single composite detector. These versions rewrite the

mathematical core, fix architectural bugs in the learning components, and add

new self-adaptive elements.


Research Preview | TradingView code is included

Results have not yet been formally tested or backtested.



================================================================================

SECTION A — BEAST MK II: ADAPTIVE CYCLE EDITION

================================================================================


Beast Mk II — Adaptive Cycle

Autoencoder anomaly detection with Ehlers adaptive cycle confirmation.


Note: This indicator is currently in research preview on ThinkOrSwim. Results

have not yet been formally tested or backtested. All observations are

qualitative.


The Idea


Most anomaly detectors trigger on a single condition — a volatility spike, a

volume outlier, an RSI extreme. The problem is that any one of those signals

fires constantly in normal markets. What's rare is when several structurally

different measures flag the same bar at the same time.


Beast Mk II monitors three independent views of the market — reconstruction

error from a rolling autoencoder, true k-nearest-neighbor distance on close,

and the direction of an Ehlers adaptive cycle — and only fires when they

agree, inside a market regime filter. The autoencoder tells you the market

is behaving unlike its recent self. The kNN tells you this bar is unusually

far from its nearest neighbors in recent history. The cycle tells you

momentum has recently turned. When all three line up under high volume or

expanded volatility, something structurally unusual is happening.



Plain English


Think of the main plot as a "strangeness ratio." It lives near zero during

normal chop and drifts upward when the market starts behaving unlike itself.

When it crosses the yellow dashed line at 1.0, the indicator is saying

"this bar doesn't fit the pattern of recent bars." When the ratio stays

above 1.0 for two or more bars and the other confirmations agree, the bar

turns red — that's the composite anomaly signal.


The cycle label below the plot shows whether Ehlers' adaptive cycle is

turning up or down, and whether a direction change happened within the last

3 bars (marked with an asterisk). A recent turn is one of the confirmation

signals that can gate an anomaly flag.


The alpha label shows how much the cycle has adapted to recent conditions.
Alpha starts at 0.20 and hill-climbs based on the cycle's own trade win
rate — stepping larger (more responsive) when the cycle is too slow for
current price action, smaller (more stable) when it's too fast. The win rate
in parentheses is the cycle's realized accuracy on its own direction-change
trades. At least 5 completed trades are required before the adapter is trusted.



Reading the Chart


Visual Element       What It Means
──────────────────────────────────────────────────────────────────────
Main plot (green)    Reconstruction error ratio — below threshold, normal state
Main plot (orange)   Ratio above 1.0 but signal not yet persistent
Main plot (red)      Composite anomaly signal is active
Yellow dashed line   Threshold at 1.0 — the line that matters
Cyan line            EMA-smoothed error ratio — slower confirmation
"Anomaly"            DETECTED or Normal, with color
"Regime"             Current market regime (Volume / Volatility / Both / Normal)
"Persist"            How many consecutive bars the signal has been active
"Ratio"              Current ratio value in decimal form
"Cycle"              Cycle direction; asterisk = recent turn
"Alpha"              Adapted alpha value + cycle win rate + trade count
──────────────────────────────────────────────────────────────────────



Technical Architecture


Layer 1 — Autoencoder Reconstruction Error

Three features (VWAP, shares-per-tick, true range) are independently

normalized with z-score over a 5-bar window, smoothed through a Hull Moving

Average, and compressed via short-period averaging. The compressed signal

is then expanded back through a mirror decoder. The absolute difference

between normalized input and reconstructed output becomes the per-feature

reconstruction error. Final error is the arithmetic mean of the three

normalized errors — ensuring no single feature dominates by scale.
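As a hedged illustration of this pipeline (Python, not the ThinkScript source: the HMA smoothing step is omitted for brevity, and the function names, the 5-bar z-score window, and the averaging encoder/decoder are assumptions):

```python
import numpy as np

def zscore(x: np.ndarray, window: int = 5) -> np.ndarray:
    """Rolling z-score over the trailing `window` bars."""
    out = np.zeros_like(x, dtype=float)
    for t in range(len(x)):
        w = x[max(0, t - window + 1): t + 1]
        sd = w.std()
        out[t] = (x[t] - w.mean()) / sd if sd > 0 else 0.0
    return out

def avg_pool(s: np.ndarray, size: int) -> np.ndarray:
    """Trailing moving average used for both encoder and mirror decoder."""
    return np.array([s[max(0, t - size + 1): t + 1].mean() for t in range(len(s))])

def reconstruction_error(feature: np.ndarray, encoding_size: int = 5) -> np.ndarray:
    """Compress via short-period averaging, expand via the mirror average,
    return per-bar absolute reconstruction error."""
    z = zscore(feature)
    decoded = avg_pool(avg_pool(z, encoding_size), encoding_size)  # encode, then mirror-decode
    return np.abs(z - decoded)

# Final error: arithmetic mean of the three normalized feature errors,
# so no single feature dominates by scale.
rng = np.random.default_rng(0)
vwap, shares_per_tick, true_range = (rng.normal(size=200) for _ in range(3))
final_error = np.mean([reconstruction_error(f)
                       for f in (vwap, shares_per_tick, true_range)], axis=0)
```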


Layer 2 — True k-Nearest-Neighbor Distance

For each bar, the absolute difference |close - close[i]| is computed for i

in [1, lookback]. Five sequential fold passes extract the k=5 smallest

distances without needing sortable arrays (each pass finds the smallest

value strictly greater than the previous pass's winner). The kNN score is

the mean of those five nearest distances. Unlike the original "Beast"

implementation — which computed mean absolute deviation and called it kNN

— this is the actual algorithm.
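The fold-pass trick can be sketched in Python (names are assumptions). One quirk of the strictly-greater rule is worth noting: duplicate distances are counted only once, so fewer than k neighbors can survive.

```python
def knn_score(closes: list[float], lookback: int = 20, k: int = 5) -> float:
    """Mean of the k nearest |close - close[i]| distances, found by k
    sequential passes instead of sorting — each pass keeps the smallest
    distance strictly greater than the previous pass's winner."""
    current = closes[-1]
    history = closes[-1 - lookback:-1]        # the `lookback` bars before this one
    nearest, prev_winner = [], -1.0
    for _ in range(k):
        best = float("inf")
        for c in history:
            d = abs(current - c)
            if prev_winner < d < best:        # strictly greater than last winner
                best = d
        if best == float("inf"):              # ran out of distinct distances
            break
        nearest.append(best)
        prev_winner = best
    return sum(nearest) / len(nearest) if nearest else 0.0
```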


Layer 3 — Ehlers Adaptive Cycle with Win-Rate Hill Climbing

A two-pole Ehlers cycle filter runs at a reference alpha of 0.20. Each

direction change of the cycle simulates a trade; realized PnL over each

trade is tracked. When the cycle's rolling win rate falls below 0.55 and

the cycle's speed is visibly mismatched to price (cycle movement > 2x

price movement = too fast; < 0.3x = too slow), alpha is nudged by one

step (default 0.01) in the corrective direction. Bounded to

[alphaMin, alphaMin + alphaSweepCount * alphaStep]. Minimum 5 completed

trades required before adapter activates.
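A sketch of the adapter's decision rule (Python; parameter names follow the table below where possible, `sweep_count` is an assumption since alphaSweepCount's default is not listed, and the nudge direction assumes the usual convention that a larger alpha makes the filter faster):

```python
def adapt_alpha(alpha: float, win_rate: float, trades: int,
                cycle_move: float, price_move: float,
                alpha_min: float = 0.05, alpha_step: float = 0.01,
                sweep_count: int = 30, min_trades: int = 5) -> float:
    """Nudge alpha one step when the cycle underperforms AND is visibly
    speed-mismatched to price; otherwise leave it alone."""
    alpha_max = alpha_min + sweep_count * alpha_step
    if trades < min_trades or win_rate >= 0.55 or price_move == 0:
        return alpha                       # adapter inactive, or cycle is fine
    speed = cycle_move / price_move
    if speed > 2.0:                        # cycle too fast: smooth it
        alpha -= alpha_step
    elif speed < 0.3:                      # cycle too slow: speed it up
        alpha += alpha_step
    return min(max(alpha, alpha_min), alpha_max)
```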


Layer 4 — Regime Filters

Two independent regime gates must agree for an anomaly to fire:

- Volume Break: HMA(10) of Volume RSI(14) > 49

- Volatility Break: ATR(5) > ATR(20) × 1.2


Composite Signal

Fires when: autoencoder ratio > 1.0 threshold AND emaError > mean

AND (kNN anomaly OR recent cycle turn) AND (volume break OR volatility

break), persistent for at least 2 bars.
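The boolean gate can be written as a pure function (Python; parameter names are assumptions, and `persist` — the count of consecutive bars the raw condition has held — is assumed to be maintained by the caller):

```python
def composite_signal(ratio: float, ema_error: float, error_mean: float,
                     knn_anomaly: bool, cycle_turn: bool,
                     volume_break: bool, volatility_break: bool,
                     persist: int, min_signal_length: int = 2) -> bool:
    """All four conditions must hold, and the raw signal must have
    persisted for at least `min_signal_length` consecutive bars."""
    raw = (ratio > 1.0
           and ema_error > error_mean
           and (knn_anomaly or cycle_turn)
           and (volume_break or volatility_break))
    return bool(raw and persist >= min_signal_length)
```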



Parameters


Parameter            Default   Notes
──────────────────────────────────────────────────────────────────────
encodingSize         5         Autoencoder compression window
thresholdLookback    20        Window for dynamic threshold stats
thresholdMultiplier  1.5       Std deviations above mean to trigger
emaLength            15        EMA smoothing on error
minSignalLength      2         Required persistence in bars
kNNLookback          20        Bars searched for nearest neighbors
fallbackAlpha        0.20      Initial cycle alpha
alphaMin             0.05      Floor for adaptation
alphaStep            0.01      Adaptation step size
minTradesForAlpha    5         Trades before adapter trusts win rate
cycleTurnLookback    3         Bars within which a turn counts
──────────────────────────────────────────────────────────────────────


Intended for 5-minute charts. A warning label appears on other timeframes.



What's Good About It


Three independent views: Autoencoder, kNN, and cycle each come from

different mathematical families — reconstruction, distance, and recursive

filtering. They fail in different ways, so their agreement is meaningful.


Ratio-based plot: The main plot is error divided by threshold. Above 1.0

means "above threshold" regardless of current market volatility, so the

indicator reads the same during calm and busy sessions.


Self-adapting cycle: Alpha is not a user-tuned constant. It climbs a hill

defined by the cycle's own realized trade accuracy. The user sees where it

landed and how many trades drove the adaptation.


True kNN: The original Beast indicator's "kNN" was actually mean absolute

deviation. This version implements real k-nearest-neighbor distance via

sequential fold passes — slower but mathematically honest.


Regime-gated: Anomalies only fire during volume or volatility expansion.

Prevents flagging during flat tape.



What's Uncertain or Risky


Not yet tested: No backtesting has been performed. No win rate, no Sharpe,

no walk-forward validation. Treat this as a research tool, not a proven

strategy.


Cycle adapter is hill-climbing, not full search: ThinkScript can't run

parallel recursive simulations inside a fold, so the adapter uses a

single reference cycle and nudges alpha based on speed mismatch plus

win rate. This converges slowly. In whipsaw markets it may never

converge at all.


Small sample win rate: The cycle's win rate is measured on its own

direction changes. Early in the day or on slow symbols, the trade

count is small and the win rate statistically noisy.


Active hours assumption: The threshold multiplier increases by 0.2

during US equity regular hours (09:30 to 16:00 ET). Hardcoded. Futures,

crypto, and non-US markets will not match this schedule.


k=5 is fixed: Although a kNNK input exists, the implementation always runs
5 fold passes, so changing the input has no effect. It was left in for
interface parity, but worth knowing.



================================================================================

SECTION B — BEAST MK III-A: ADAPTIVE CYCLE + RNN

================================================================================


Beast Mk III-A — Adaptive Cycle + RNN

All of Mk II, plus a properly implemented recurrent neural network running

in parallel.



Note: This indicator is currently in research preview on ThinkOrSwim. Results

have not yet been formally tested or backtested. All observations are

qualitative.



The Idea


Mk II confirms anomalies through an adaptive cycle — a mathematical filter

that reacts to past price. Mk III-A adds a fourth independent view: a small

neural network that actively learns. The RNN reads four binary market

features, passes them through three hidden sigmoid nodes, and outputs a

probability that the next bar will close up. Weights update every bar via

online gradient descent — actual learning, not the frozen-random-weights

state the original Beast indicator shipped with.


The RNN contributes to anomaly detection through disagreement. When the

network is confident about direction (> 0.65 or < 0.35) but realized

price action has moved the other way over the last 3 bars, that confident-

but-wrong state is itself an anomaly signal. The market is doing something

the network's learned model didn't expect.



Plain English


Everything from Mk II still applies — same main plot, same three core

confirmations, same regime gates. Mk III-A adds two extra labels.


"RNN: 0.72 Up !" means the neural network's probability output is 0.72,

it's predicting up, and the exclamation mark indicates a disagreement

anomaly — the network thinks up, but recent bars have gone down.


"RNN Acc: 58%" is the rolling accuracy of the network's predictions over

the last 20 bars. Above 55% is green, below 45% is red. This tells you

whether to weight the RNN's contribution right now. A network running at

50% accuracy is guessing.



Reading the Chart


Same as Mk II, plus:


Visual Element       What It Means
──────────────────────────────────────────────────────────────────────
"RNN"                Network output probability + direction + anomaly flag
"RNN Acc"            Rolling 20-bar prediction accuracy
──────────────────────────────────────────────────────────────────────



Technical Architecture


Layers 1–4 from Mk II are unchanged.


Layer 5 — Recurrent Neural Network

Architecture: 4 binary inputs → 3 sigmoid hidden nodes → 1 sigmoid output.

All connections are weighted; no bias term (to reduce parameter count).


Inputs:

- x1: 1 if close > close[1] else 0

- x2: 1 if Volume RSI > Volume RSI[1] else 0

- x3: 1 if TR > Average(TR, 14) else 0

- x4: 1 if close > VWAP else 0


Forward Pass (uses previous bar's inputs to predict current bar's direction):

- Hidden: h_j = sigmoid(Σ w_ij * x_i[1])

- Output: o = sigmoid(Σ v_j * h_j)


Training:
- Loss: L = 0.5 * (o - target)² where target = 1 if close > close[1], else 0
- Gradient: dL/dw_ij computed via standard backpropagation
- Update: w_ij[t] = w_ij[t-1] - eta * dw_ij[t-1]
- Learning rate eta is fixed at 0.10 (this is what Mk III-C changes)

Weight initialization: (Random() - 0.5) * 0.5 on bar 1. Small values

centered near zero prevent early sigmoid saturation.


RNN Anomaly Condition:

Network output is "confident" when |o - 0.5| > 0.15. Realized move over

the last 3 bars is computed as the majority direction. Anomaly fires

when confident prediction disagrees with realized majority.


Composite Signal Modification:

Where Mk II required (kNN OR cycleTurn), Mk III-A requires

(kNN OR cycleTurn OR rnnAnomaly). All other conditions unchanged.



Parameters (additions beyond Mk II)


Parameter            Default   Notes
──────────────────────────────────────────────────────────────────────
eta                  0.10      Fixed learning rate
rnnConfidenceBand    0.15      Distance from 0.5 to count as confident
rnnDisagreeLookback  3         Bars of realized data for disagreement
──────────────────────────────────────────────────────────────────────


Intended for 5-minute charts.



What's Good About It


Real learning: Unlike the original Beast indicator's RNN (which was broken —

weights never updated due to a recursive assignment bug), Mk III-A runs

genuine online gradient descent with proper backpropagation through a

three-node hidden layer.


Disagreement signal: The RNN doesn't just predict — it flags when its

prediction conflicts with realized price. This gives it a distinct role

in the composite signal, rather than duplicating what the cycle already

does.


Transparent accuracy: The 20-bar rolling accuracy label tells you whether

to trust the network right now. No hidden quality score.


Parallel architecture: The RNN runs independently of the cycle. If the

network is doing well (accuracy > 55%) and the cycle is also well-tuned

(alpha converged), they can both contribute confirmation. Neither can

veto the other.



What's Uncertain or Risky


Not yet tested: No backtesting has been performed. Everything below is

qualitative speculation.


Small network, binary inputs: Four binary inputs mean only 16 possible

input patterns. Three hidden nodes with 15 total weights. This is a tiny

network. It will learn something — but the ceiling on what it can learn

is low. Binary inputs discard magnitude; "close up by a penny" looks

identical to "close up by a dollar."


Target is noisy: Predicting next-bar direction from bar-over-bar signals

is one of the harder problems in price action. 5-minute bar direction

has a large random component. If accuracy hovers around 50% indefinitely,

that's the problem talking, not a bug.


Online learning drift: The network's weights can wander during regime

shifts, potentially producing long periods of worse-than-random

predictions before accuracy recovers. The 20-bar rolling accuracy gives

you a read on this — but it's after the fact.


Composite signal is more permissive: Because RNN anomaly is ORed into

confirmation, Mk III-A fires signals that Mk II wouldn't. If that turns

out to include more false positives than true ones, Mk II is the

cleaner tool.


Random initialization: Each time the chart reloads, weights are re-

randomized. Two identical charts opened seconds apart can produce

different signals on the same bars. This is a fundamental property of

running stochastic initialization in a declarative indicator language.



================================================================================

SECTION C — BEAST MK III-C: SELF-TUNING EDITION

================================================================================


Beast Mk III-C — Self-Tuning Edition

Mk III-A with an additional feedback loop: the RNN's learning rate adapts

from its own prediction accuracy.



Note: This indicator is currently in research preview on ThinkOrSwim. Results

have not yet been formally tested or backtested. All observations are

qualitative.



The Idea


In Mk III-A, the network's learning rate (eta) is fixed at 0.10. That's a

reasonable default but it's a compromise — too low and the network adapts

slowly to new regimes, too high and the network thrashes during noisy

periods. Mk III-C closes the loop: the network's own rolling prediction

accuracy determines how aggressively it should learn.


When accuracy is high (>= 55%), the network is doing well and should

stabilize — eta decreases, reducing weight volatility. When accuracy is

low (<= 45%), the current weights aren't working — eta increases, allowing

the network to explore faster. Between 45% and 55%, eta holds.


Two independent self-adaptation loops now run simultaneously:

- The cycle's alpha adapts from cycle trade win rate (as in Mk II)

- The RNN's eta adapts from RNN rolling prediction accuracy



Plain English


Mk III-C reads identically to Mk III-A on the chart, with one label

change. Where Mk III-A shows "RNN Acc: 58%", Mk III-C shows

"Acc: 58% / Eta: 0.08" — the current rolling accuracy and the current

adapted learning rate side by side.


During calm, predictable periods, eta will drift down. During choppy or

surprising periods, eta will climb. You can watch the network "decide"

to pay more or less attention to recent data.



Reading the Chart


Same as Mk III-A, with one label change:


Visual Element       What It Means
──────────────────────────────────────────────────────────────────────
"Acc / Eta"          Rolling accuracy + current adapted learning rate
──────────────────────────────────────────────────────────────────────



Technical Architecture


Layers 1–5 from Mk III-A are unchanged, except the fixed eta is replaced

with an adaptive eta.


Self-Tuning Learning Rate


The adaptation rule uses the previous bar's rolling accuracy (to break the

circular dependency between weights, predictions, and accuracy):


prevAccuracy = rnnAccuracy[1]

if bn < minBarsForEta:
    eta = etaInitial
else if prevAccuracy >= etaAccHigh and eta[1] > etaMin + etaStep:
    eta = eta[1] - etaStep    # exploit — stabilize
else if prevAccuracy <= etaAccLow and eta[1] < etaMax - etaStep:
    eta = eta[1] + etaStep    # explore — adapt faster
else:
    eta = eta[1]

eta clipped to [etaMin, etaMax]
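The same rule as a testable Python function (a sketch, not the ThinkScript source; parameter names follow the table below):

```python
def adapt_eta(prev_eta: float, prev_accuracy: float, bar_number: int,
              eta_initial: float = 0.10, eta_min: float = 0.01,
              eta_max: float = 0.50, eta_step: float = 0.01,
              acc_high: float = 0.55, acc_low: float = 0.45,
              min_bars: int = 30) -> float:
    """Set this bar's eta from the PREVIOUS bar's rolling accuracy."""
    if bar_number < min_bars:
        return eta_initial                      # warm-up: hold the default
    if prev_accuracy >= acc_high and prev_eta > eta_min + eta_step:
        eta = prev_eta - eta_step               # exploit — stabilize
    elif prev_accuracy <= acc_low and prev_eta < eta_max - eta_step:
        eta = prev_eta + eta_step               # explore — adapt faster
    else:
        eta = prev_eta
    return min(max(eta, eta_min), eta_max)
```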


Everything else — forward pass, backprop, weight update formula, anomaly

condition — is identical to Mk III-A.



Parameters (additions beyond Mk III-A)


Parameter            Default   Notes
──────────────────────────────────────────────────────────────────────
etaInitial           0.10      Starting learning rate
etaMin               0.01      Floor
etaMax               0.50      Ceiling
etaStep              0.01      Adjustment size per bar
rnnAccWindow         20        Window for rolling accuracy
minBarsForEta        30        Bars before adapter activates
etaAccHigh           0.55      Above this, decrease eta
etaAccLow            0.45      Below this, increase eta
──────────────────────────────────────────────────────────────────────


Intended for 5-minute charts.



What's Good About It


Adaptive learning rate: The network tunes itself based on how it's doing,

not based on a value someone picked six months ago. This is genuine

closed-loop adaptation.


Two adaptation loops, independent: The cycle adapts from its own win

rate; the RNN adapts from its own accuracy. Neither depends on the

other's success. If one adapter is working well and the other isn't,

the indicator doesn't collapse.


Visible feedback: The eta value in the label makes the adaptation

auditable. You can see whether the network is confident (low eta) or

searching (high eta) at any moment.


Exploit-explore pattern: The adapter implements a simple, principled

version of the exploit-explore tradeoff from reinforcement learning.

Good performance → settle. Bad performance → try new weight

configurations faster.



What's Uncertain or Risky


Not yet tested: No backtesting has been performed. Everything below

is qualitative.


One-bar adaptation lag: Accuracy this bar depends on this bar's weights,
which depend on this bar's eta. That circularity is broken by using the
previous bar's accuracy to set this bar's eta, so the adapter always
reacts one bar late. In principle this is fine; in practice it means
adaptation won't be instant.


Potential oscillation: If accuracy holds inside the 0.45–0.55 band, eta
stays put. If accuracy keeps crossing the band's edges, eta will oscillate
with it. Whether this damps out or amplifies depends on the market; keep
an eye on whether eta is converging or bouncing.


Adaptation does not fix bad architecture: A 15-weight network on 4

binary inputs has a ceiling regardless of how well its learning rate

is tuned. Self-tuning makes the network reach its ceiling faster. It

does not raise the ceiling.


Compounding uncertainty: Mk III-C has two active adaptation loops (eta

and alpha), plus the RNN's weights, plus the autoencoder's response.

Debugging unexpected behavior is harder than in Mk II. When in doubt,

fall back to Mk II for a simpler read.


All Mk III-A risks still apply: Random initialization, small network

ceiling, binary input limitation, online learning drift.



================================================================================

CREDITS & LINEAGE

================================================================================


Original "Beast Autoencoder-RNN for Anomaly Detection": mcdon030 (2025),

enhanced by Grok. Shared in the ThinkOrSwim community under the original

file header. The architecture of the original — combining autoencoder

reconstruction error, an RNN, and a kNN-style distance score — is

preserved in these versions.


What these versions change:

- Fixed the RNN's frozen-weights bug (recursive assignment via CompoundValue)

- Replaced the simplified 2-input single-node RNN with 4-input 3-hidden-node

architecture and proper backpropagation

- Replaced mean-absolute-deviation-called-kNN with real kNN (k=5 smallest

distances via sequential fold passes)

- Removed regularization penalty that was subtracted from activations

(regularization belongs in the loss function, not the output)

- Removed injected random noise from the encoder (denoising autoencoders

need noise during training, not during streaming inference)

- Added the Ehlers adaptive cycle layer with hill-climbing alpha adaptation

(Mk II, III-A, III-C)

- Added the RNN disagreement anomaly mode (Mk III-A, III-C)

- Added self-tuning learning rate (Mk III-C only)

- Replaced raw reconstruction error plot with ratio-to-threshold plot

for consistent visual interpretation across market conditions

- Added Alert() call on fresh signals


All three versions are free, shared in the spirit of open research.

MarketFragments.com is building toward a community of rigorous, data-driven

traders — these tools are an invitation to collaborate. If you test any of

them and have results, we want to hear from you.



================================================================================

DISCLAIMERS

================================================================================


- All three indicators are research previews

- No backtesting has been performed

- No win rate, no Sharpe, no walk-forward validation, no statistical

significance testing

- Qualitative observations only

- Live trading conditions including slippage, fills, and timing may

produce outcomes that differ from chart appearance

- Not financial advice

- This is a research and educational tool


Attachments:


TradingView:

Beast Autoencoder + Adaptive Cycle for Anomaly Detection:


ThinkScript:

Beast Autoencoder + Adaptive Cycle for Anomaly Detection:

Beast Autoencoder + Adaptive Cycle + RNN for Anomaly Detection:

Beast Autoencoder + Adaptive Cycle + Self-Tuning RNN for Anomaly Detection:

