Competitor Review Intelligence

Fake Review Detection System -- Powered by Robert Dove's Thesis Research
Generated: April 13, 2026 at 02:00 AM
Businesses Tracked: 11
Reviews Analyzed: 57
Flagged Suspicious: 0
Flagged Rate: 0.0%

Competitor Overview

| Business | Avg Fake Score | Reviews (API) | Flagged | Avg Rating | 5-Star Reviews | Max Fake Score |
| --- | --- | --- | --- | --- | --- | --- |
| Quick Plumbing | 10.6 | 5 | 0 (0.0%) | 5.0 | 5 | 15.8 |
| Kevin Ginnings Plumbing | 7.5 | 5 | 0 (0.0%) | 5.0 | 5 | 13.3 |
| Anthony Plumbing | 7.2 | 6 | 0 (0.0%) | 4.5 | 5 | 10.7 |
| PlumbSmart | 7.0 | 5 | 0 (0.0%) | 5.0 | 5 | 11.2 |
| AB May Heating and Cooling | 7.0 | 6 | 0 (0.0%) | 5.0 | 6 | 10.1 |
| A-1 Sewer & Septic Service | 6.3 | 5 | 0 (0.0%) | 3.4 | 3 | 8.7 |
| Cororum Plumbing | 6.2 | 5 | 0 (0.0%) | 5.0 | 5 | 13.4 |
| Inception Plumbing | 5.7 | 5 | 0 (0.0%) | 5.0 | 5 | 9.9 |
| Bright Side Plumbing | 5.5 | 5 | 0 (0.0%) | 5.0 | 5 | 6.9 |
| Dick Ray Master Plumber | 5.2 | 5 | 0 (0.0%) | 5.0 | 5 | 11.3 |
| Benjamin Franklin Plumbing KC | 5.1 | 5 | 0 (0.0%) | 4.4 | 4 | 6.6 |

Flagged Reviews (Score >= 50)

| Business | Reviewer | Rating | Fake Score | Review Text | Top Engines |
| --- | --- | --- | --- | --- | --- |

No flagged reviews yet. Run a scan first.

Detection Methodology

Six detection engines analyze each review. Scores combine into a weighted composite (0-100). Based on Robert Dove's thesis research on sentiment analysis and text mining (NW Missouri State).
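The blending step can be sketched as follows. The dictionary keys are shorthand names for the six engines described below, and the weights are the ones listed in this section; the function itself is an illustrative sketch, not the thesis code.

```python
# Weighted composite of six per-engine scores (each 0-100) into one 0-100 score.
# Engine names are shorthand; weights are the ones documented in this section.
WEIGHTS = {
    "sentiment_uniformity": 0.15,
    "linguistic_authenticity": 0.25,
    "temporal_velocity": 0.20,
    "reviewer_profile": 0.20,
    "text_similarity": 0.10,
    "ai_content": 0.10,
}

def composite_score(engine_scores: dict) -> float:
    """Blend per-engine 0-100 scores into one weighted 0-100 composite."""
    total = sum(engine_scores.get(name, 0.0) * w for name, w in WEIGHTS.items())
    return round(total, 1)

scores = {"sentiment_uniformity": 80, "linguistic_authenticity": 40,
          "temporal_velocity": 0, "reviewer_profile": 20,
          "text_similarity": 0, "ai_content": 10}
composite_score(scores)  # 80*.15 + 40*.25 + 0 + 20*.20 + 0 + 10*.10 = 27.0
```

A review is flagged when this composite reaches the 50-point threshold used in the table above.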

Engine 1: Sentiment Uniformity

Scores each review against the Bing Liu opinion lexicon (6,789 words). Real reviews tend to mix positive and negative sentiment; fake reviews are often uniformly positive or negative.

Weight: 15%
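A minimal sketch of the uniformity check, using a tiny stand-in word list (the real engine uses the full ~6,800-word Bing Liu lexicon) and an assumed 0-100 rescaling:

```python
# Stand-in opinion word lists; the production engine uses the Bing Liu lexicon.
POSITIVE = {"great", "excellent", "friendly", "fast", "amazing", "best"}
NEGATIVE = {"slow", "rude", "leak", "problem", "late", "worst"}

def sentiment_uniformity(text: str) -> float:
    """Return 0-100: high when every opinion word points the same way."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    if total == 0:
        return 0.0  # no opinion words -> nothing to score
    uniformity = max(pos, neg) / total   # 0.5 (evenly mixed) .. 1.0 (one-sided)
    return round((uniformity - 0.5) * 200, 1)  # rescale to 0-100

sentiment_uniformity("Amazing fast friendly best service!")       # 100.0, one-sided
sentiment_uniformity("Fast work but they were late and rude.")    # lower, mixed
```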

Engine 2: Linguistic Authenticity

Uses the DOVE Dictionary to check for hedging, discourse particles, hesitators, personal pronouns, and time adverbials. Real reviews typically hit 3 or more of these categories.

Weight: 25%
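The category-counting idea can be sketched like this. The word lists below are small illustrative stand-ins, not the DOVE Dictionary itself, and the 0-100 scaling is an assumption:

```python
# Stand-in category lists; the production engine uses the DOVE Dictionary.
CATEGORIES = {
    "hedging": {"maybe", "probably", "somewhat", "seemed"},
    "discourse_particles": {"well", "anyway", "honestly", "actually"},
    "hesitators": {"um", "uh", "hmm"},
    "pronouns": {"i", "we", "my", "our", "me"},
    "time_adverbials": {"yesterday", "today", "recently", "later"},
}

def authenticity_score(text: str) -> float:
    """0-100 suspicion score: fewer natural-language categories -> higher."""
    tokens = {w.strip(".,!?").lower() for w in text.split()}
    hits = sum(bool(words & tokens) for words in CATEGORIES.values())
    # 3+ categories reads as human (score 0); 0 categories is maximally suspicious.
    return round(max(0, 3 - hits) / 3 * 100, 1)

authenticity_score("Honestly, we called them yesterday and my sink seemed fine.")  # 0.0
authenticity_score("Excellent service. Highly recommended. Five stars.")           # 100.0
```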

Engine 3: Temporal Velocity

Groups reviews by posting date and flags bursts: 5+ reviews on the same day, or 3+ within a 2-day window, relative to the business's normal review rate.

Weight: 20%
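The burst check can be sketched as a boolean flag. The thresholds (5+ same-day, 3+ in a 2-day window) come from the description above; returning a flag rather than a graded score is a simplifying assumption:

```python
from datetime import date
from collections import Counter

def velocity_flags(review_dates: list) -> bool:
    """True when review timing looks like a coordinated burst."""
    by_day = Counter(review_dates)
    if any(n >= 5 for n in by_day.values()):
        return True  # 5+ reviews on the same calendar day
    for d in by_day:
        # count reviews on day d plus the following day (a 2-day window)
        window = sum(n for day, n in by_day.items() if 0 <= (day - d).days <= 1)
        if window >= 3:
            return True
    return False

velocity_flags([date(2026, 4, 1)] * 3)                                   # True
velocity_flags([date(2026, 1, 1), date(2026, 2, 1), date(2026, 3, 1)])   # False
```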

Engine 4: Reviewer Profile

Analyzes reviewer patterns: generic phrases, minimal text with 5 stars, excessive punctuation, all-caps.

Weight: 20%
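A sketch of the pattern checks. The generic-phrase list and the point values are illustrative assumptions, not the thesis weights:

```python
import re

# Illustrative stock phrases; the real engine's phrase list may differ.
GENERIC = ("great service", "highly recommend", "best ever", "amazing job")

def profile_score(text: str, rating: int) -> float:
    """0-100: accumulate points for suspicious reviewer patterns."""
    score = 0
    low = text.lower()
    if any(p in low for p in GENERIC):
        score += 30                      # stock marketing phrases
    if rating == 5 and len(text.split()) < 5:
        score += 30                      # minimal text paired with 5 stars
    if re.search(r"[!?]{3,}", text):
        score += 20                      # excessive punctuation (!!! / ???)
    letters = [c for c in text if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.5:
        score += 20                      # mostly all-caps shouting
    return float(min(score, 100))

profile_score("BEST EVER!!!", 5)  # hits all four patterns -> 100.0
```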

Engine 5: Text Similarity

TF-IDF vectorization + cosine similarity. Flags review pairs with >0.7 similarity (copy/paste or templates).

Weight: 10%
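A pure-stdlib sketch of the pairwise similarity check (a production version could equally use scikit-learn's TfidfVectorizer):

```python
import math
from collections import Counter

def tfidf_vectors(docs: list) -> list:
    """Build a sparse TF-IDF vector (dict of term -> weight) per document."""
    token_docs = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter(t for toks in token_docs for t in set(toks))  # document frequency
    vecs = []
    for toks in token_docs:
        tf = Counter(toks)
        vecs.append({t: (c / len(toks)) * math.log((1 + n) / (1 + df[t]))
                     for t, c in tf.items()})
    return vecs

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["great plumber fast and friendly",
        "great plumber fast and friendly",
        "the tech diagnosed a slab leak and fixed it same day"]
vecs = tfidf_vectors(docs)
cosine(vecs[0], vecs[1])  # identical text scores ~1.0, well above the 0.7 threshold
cosine(vecs[0], vecs[2])  # independent wording scores near 0
```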

Engine 6: AI Content Detection

Checks sentence length uniformity, vocabulary diversity (TTR), punctuation patterns, formal transition words.

Weight: 10%
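A sketch of three of the signals (sentence-length uniformity, type-token ratio, and formal transition words); the transition list and any thresholds applied to these raw signals are illustrative assumptions:

```python
import re
import statistics

# Illustrative formal transitions; the real engine's list may differ.
TRANSITIONS = {"furthermore", "moreover", "additionally", "consequently"}

def ai_signals(text: str) -> dict:
    """Raw per-review signals used by the AI-content check."""
    sentences = [s.split() for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s) for s in sentences]
    words = text.lower().split()
    return {
        # near-identical sentence lengths (low stdev) are an AI tell
        "length_stdev": statistics.pstdev(lengths) if len(lengths) > 1 else 0.0,
        # type-token ratio: low vocabulary diversity is another tell
        "ttr": len(set(words)) / len(words) if words else 0.0,
        "formal_transitions": sum(w.strip(".,") in TRANSITIONS for w in words),
    }

ai_signals("Furthermore, the service was good. Moreover, the service was good.")
# perfectly uniform sentence lengths, low TTR, two formal transitions
```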

Limitation: The Google Places API returns a maximum of five reviews per business, so these results are a proof-of-concept sample. Comprehensive analysis would require scraping Google Business Profile pages directly or using the Google Business Profile API (which requires owner access). The detection engines themselves are production-ready and will scale to larger review volumes.