Add new limit option to limit the requested amount of posts from the Reddit API (#203)
All checks were successful
Test / build (push) Successful in 8s
- Add the limit option
- Valid number between 1 and 100
- Defaults to 100
- Updated documentation to mention this

#137

Reviewed-on: #203
Reviewed-by: VylpesTester <tester@vylpes.com>
Co-authored-by: Ethan Lane <ethan@vylpes.com>
Co-committed-by: Ethan Lane <ethan@vylpes.com>
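For a quick sense of the change, here is a minimal usage sketch assembled from the new signature and the `IReturnResult` fields visible in the diff below; the import path is assumed from the repository name:

```typescript
import randomBunny from "random-bunny"; // assumed entry point (default export of src/index.ts)

async function main() {
    // limit is the new third argument: defaults to 100, must be between 1 and 100.
    const result = await randomBunny("rabbits", "hot", 50);

    if (result.IsSuccess) {
        console.log(result.Result);
    } else {
        // An out-of-range limit fails fast with ErrorCode.LimitOutOfRange,
        // before any request is sent to the Reddit API.
        console.error(result.Error?.Message);
    }
}

main();
```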
This commit is contained in:
parent 6f0109ae6e
commit 2b3e65302b

12 changed files with 170 additions and 17 deletions
39 docs/cli.md
@@ -4,7 +4,7 @@ Since Version 2.2, Random Bunny contains a command line interface (CLI).
 ## Downloads
 
-The project can be downloaded as a binary for your system via the [GitHub Releases](https://github.com/Vylpes/random-bunny/releases) or [Gitea Releases](https://gitea.vylpes.xyz/RabbitLabs/random-bunny/releases) page.
+The project can be downloaded as a binary for your system via the [GitHub Releases](https://github.com/Vylpes/random-bunny/releases) or [Forgejo Releases](https://git.vylpes.xyz/RabbitLabs/random-bunny/releases) page.
 
 We currently support:
 
 - Linux (x64)
@@ -13,6 +13,8 @@ We currently support:
 The git repository can also be cloned and ran via `yarn build` and `yarn start`.
 
 You can produce the binary using the `yarn package` command. This creates the binaries in the `./bin` folder.
 
+> **NOTE:** We are aware of a bug in the macOS Arm64 builds failing to execute. For now you're still able to use the x64 builds under Rosetta fine. This will hopefully be fixed in a future release.
+
 ## Default Output
@@ -20,7 +22,7 @@ The git repository can also be cloned and ran via `yarn build` and `yarn start`.
 By default, the command will fetch a random image from `r/rabbits` and return it in a human-readable output.
 
 ```
-$ randombunny
+$ random-bunny
 
 Archived = false
 Downvotes = 0
@@ -38,11 +40,11 @@ Url = https://i.redd.it/sfz0srdrimjc1.png
 The command also includes a help option in case you are stuck.
 
 ```
-$ randombunny --help
+$ random-bunny --help
 
 # or
 
-$ randombunny -h
+$ random-bunny -h
 
 Usage: random-bunny [options]
 
@@ -55,6 +57,7 @@ Options:
   -q, --query-metadata  Include query metadata in result
   -o <file>             Output to file
   --sort <sort>         Sort by (choices: "hot", "new", "top", default: "hot")
+  --limit <limit>       The amount of posts to fetch from the reddit api (default: 100)
   -h, --help            display help for command
 ```
 
@@ -63,11 +66,11 @@ Options:
 You can also convert the output into JSON, if you need to input it to another program.
 
 ```bash
-$ randombunny --json
+$ random-bunny --json
 
 # or
 
-$ randonbunny -j
+$ random-bunny -j
 
 {"Archived":false,"Downs":0,"Hidden":false,"Permalink":"/r/Rabbits/comments/1av1rg9/cute_baby_bun/","Subreddit":"Rabbits","SubredditSubscribers":486085,"Title":"Cute baby bun","Ups":210,"Url":"https://i.redd.it/sfz0srdrimjc1.png"}
 ```
@@ -79,9 +82,9 @@ You can also choose the sorting option which reddit will use to return the avail
 This defaults to "hot". The valid options are "hot", "new", and "top".
 
 ```
-$ randombunny --sort hot
-$ randombunny --sort new
-$ randomBunny --sort top
+$ random-bunny --sort hot
+$ random-bunny --sort new
+$ random-bunny --sort top
 ```
 
@@ -92,8 +95,8 @@ You can change the subreddit which the command fetches from.
 This defaults to "rabbits"
 
 ```
-$ randombunny --subreddit rabbits
-$ randombunny -s horses
+$ random-bunny --subreddit rabbits
+$ random-bunny -s horses
 ```
 
 ## Output to file
@@ -103,3 +106,17 @@ If you'd rather send the output to a file, you can supply the `-o` flag.
 ```
 $ randombunny -o ~/Desktop/output.txt
 ```
+
+## Reddit API Return Limits
+
+You can also limit the number of posts the script requests from the Reddit API
+using the `--limit` option.
+
+This defaults to 100. It accepts any number between 1 and 100.
+
+Please note that limiting the request to fewer than 100 posts gives a higher chance of
+the script not finding any valid image post to return.
+
+```
+$ random-bunny --limit 50
+```
@@ -33,7 +33,7 @@ console.log(result);
 
 ### `randomBunny()`
 
-Returns a `json string` for a random post. Accepts 2 arguments: `subreddit`, and `sortby` ('new', 'hot', 'top')
+Returns a `json string` for a random post. Accepts 3 arguments: `subreddit`, `sortby` ('new', 'hot', 'top'), and `limit` (1-100, default 100)
 
 The json string which gets returned consists of:
 - archived
@@ -14,11 +14,12 @@ program
     .option('-j, --json', 'Output as JSON')
     .option('-q, --query-metadata', 'Include query metadata in result')
     .option('-o <file>', 'Output to file')
-    .addOption(new Option('--sort <sort>', 'Sort by').default('hot').choices(['hot', 'new', 'top']));
+    .addOption(new Option('--sort <sort>', 'Sort by').default('hot').choices(['hot', 'new', 'top']))
+    .addOption(new Option('--limit <limit>', 'The amount of posts to fetch from the reddit api').default(100));
 
 program.parse();
 
 const options: ICliOptions = program.opts();
 
-randomBunny(options.subreddit, options.sort)
+randomBunny(options.subreddit, options.sort, options.limit)
     .then((response) => exit(CliHelper.Endpoint(response, options)));
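One side note on the CLI wiring above: commander passes user-supplied option values through as strings, so `--limit 50` reaches `randomBunny` as the string `"50"` even though `ICliOptions` types `limit` as a `number` (only the default `100` stays a number). The range check still behaves because JavaScript coerces strings in numeric comparisons, but attaching a parser would keep the types honest. A hedged sketch using commander's `argParser`, which is not part of this diff:

```typescript
import { Option } from "commander";

// Sketch only: coerce --limit to a number at parse time so the
// ICliOptions.limit field holds a real number rather than a string.
const limitOption = new Option("--limit <limit>", "The amount of posts to fetch from the reddit api")
    .default(100)
    .argParser((value: string) => Number(value));
```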
@@ -3,4 +3,5 @@ export enum ErrorCode {
     FailedToFetchReddit,
     UnableToParseJSON,
     NoImageResultsFound,
+    LimitOutOfRange,
 }
@@ -2,4 +2,5 @@ export default class ErrorMessages {
     public static readonly FailedToFetchReddit = "Failed to fetch result from Reddit";
     public static readonly UnableToParseJSON = "Unable to parse the JSON result";
     public static readonly NoImageResultsFound = "No image results found in response from Reddit";
+    public static readonly LimitOutOfRange = "Limit must be a number between 1 and 100";
 }
@@ -3,5 +3,6 @@ export default interface ICliOptions {
     json?: boolean,
     sort: "new" | "hot" | "top",
     o?: string,
+    limit: number,
     queryMetadata?: boolean,
 }
@@ -1,4 +1,5 @@
 export default interface QueryResult {
     subreddit: string,
     sortBy: string,
+    limit: number,
 }
@@ -24,6 +24,7 @@ export default class OutputHelper {
         if (options.queryMetadata != null) {
             outputLines.push(`Query.Subreddit = ${response.Query.subreddit}`);
             outputLines.push(`Query.Sort By = ${response.Query.sortBy}`);
+            outputLines.push(`Query.Limit = ${response.Query.limit}`);
         }
 
         return outputLines.join("\n");
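With the `OutputHelper` change above, the `-q, --query-metadata` output gains a `Query.Limit` line. Per the updated test snapshot further down, the tail of the human-readable output now looks like:

```
Query.Subreddit = rabbits
Query.Sort By = hot
Query.Limit = 100
```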
24 src/index.ts
@@ -7,8 +7,23 @@ import { ErrorCode } from "./constants/ErrorCode";
 import ErrorMessages from "./constants/ErrorMessages";
 import ImageHelper from "./helpers/imageHelper";
 
-export default async function randomBunny(subreddit: string, sortBy: "new" | "hot" | "top" = 'hot'): Promise<IReturnResult> {
-    const result = await fetch(`https://reddit.com/r/${subreddit}/${sortBy}.json?limit=100`)
+export default async function randomBunny(subreddit: string, sortBy: "new" | "hot" | "top" = 'hot', limit: number = 100): Promise<IReturnResult> {
+    if (limit < 1 || limit > 100) {
+        return {
+            IsSuccess: false,
+            Query: {
+                subreddit: subreddit,
+                sortBy: sortBy,
+                limit: limit,
+            },
+            Error: {
+                Code: ErrorCode.LimitOutOfRange,
+                Message: ErrorMessages.LimitOutOfRange,
+            }
+        };
+    }
+
+    const result = await fetch(`https://reddit.com/r/${subreddit}/${sortBy}.json?limit=${limit}`)
         .then((res) => {
             return res;
         })
@@ -22,6 +37,7 @@ export default async function randomBunny(subreddit: string, sortBy: "new" | "ho
             Query: {
                 subreddit: subreddit,
                 sortBy: sortBy,
+                limit: limit,
             },
             Error: {
                 Code: ErrorCode.FailedToFetchReddit,
@@ -38,6 +54,7 @@ export default async function randomBunny(subreddit: string, sortBy: "new" | "ho
             Query: {
                 subreddit: subreddit,
                 sortBy: sortBy,
+                limit: limit,
             },
             Error: {
                 Code: ErrorCode.UnableToParseJSON,
@@ -60,6 +77,7 @@ export default async function randomBunny(subreddit: string, sortBy: "new" | "ho
             Query: {
                 subreddit: subreddit,
                 sortBy: sortBy,
+                limit: limit,
             },
             Error: {
                 Code: ErrorCode.NoImageResultsFound,
@@ -85,6 +103,7 @@ export default async function randomBunny(subreddit: string, sortBy: "new" | "ho
             Query: {
                 subreddit: subreddit,
                 sortBy: sortBy,
+                limit: limit,
             },
             Error: {
                 Code: ErrorCode.NoImageResultsFound,
@@ -115,6 +134,7 @@ export default async function randomBunny(subreddit: string, sortBy: "new" | "ho
         Query: {
             subreddit: subreddit,
             sortBy: sortBy,
+            limit: limit,
         },
         Result: redditResult
     };
@@ -25,5 +25,6 @@ Title = This is my Ms Bear!
 Upvotes = 17
 Url = https://preview.redd.it/d5yno653zf7d1.jpg?width=640&crop=smart&auto=webp&s=5064d1caec3c12ac2855eb57ff131d0b313d5e9d
 Query.Subreddit = rabbits
-Query.Sort By = hot"
+Query.Sort By = hot
+Query.Limit = 100"
 `;
@@ -10,6 +10,7 @@ describe("GenerateOutput", () => {
             Query: {
                 subreddit: "rabbits",
                 sortBy: "hot",
+                limit: 100,
             },
             Result: {
                 Archived: false,
@@ -40,6 +41,7 @@ describe("GenerateOutput", () => {
             Query: {
                 subreddit: "rabbits",
                 sortBy: "hot",
+                limit: 100,
             },
             Result: {
                 Archived: false,
@@ -72,6 +74,7 @@ describe("GenerateOutput", () => {
             Query: {
                 subreddit: "rabbits",
                 sortBy: "hot",
+                limit: 100,
             },
             Result: {
                 Archived: false,
@@ -7,6 +7,10 @@ import fetch from "got-cjs";
 jest.mock('got-cjs');
 const fetchMock = jest.mocked(fetch);
 
+beforeEach(() => {
+    fetchMock.mockReset();
+});
+
 describe('randomBunny', () => {
     test('GIVEN subreddit AND sortBy is supplied, EXPECT successful result', async() => {
         fetchMock.mockResolvedValue({
@@ -231,4 +235,106 @@ describe('randomBunny', () => {
         expect(result.Error?.Code).toBe(ErrorCode.NoImageResultsFound);
         expect(result.Error?.Message).toBe(ErrorMessages.NoImageResultsFound);
     });
+
+    test("GIVEN limit is supplied, EXPECT limit sent to the API", async () => {
+        fetchMock.mockResolvedValue({
+            body: JSON.stringify({
+                data: {
+                    children: [
+                        {
+                            data: {
+                                archived: false,
+                                downs: 0,
+                                hidden: false,
+                                permalink: '/r/Rabbits/comments/12pa5te/someone_told_pickles_its_monday_internal_fury/',
+                                subreddit: 'Rabbits',
+                                subreddit_subscribers: 298713,
+                                title: 'Someone told pickles it’s Monday… *internal fury*',
+                                ups: 1208,
+                                url: 'https://i.redd.it/cr8xudsnkgua1.jpg',
+                            },
+                        },
+                    ],
+                }
+            }),
+        });
+
+        const result = await randomBunny('rabbits', 'new', 50);
+
+        expect(result.IsSuccess).toBeTruthy();
+        expect(result.Result).toBeDefined();
+        expect(result.Error).toBeUndefined();
+
+        expect(fetchMock).toHaveBeenCalledWith('https://reddit.com/r/rabbits/new.json?limit=50');
+    });
+
+    test("GIVEN limit is less than 1, EXPECT error to be returned", async () => {
+        fetchMock.mockResolvedValue({
+            body: JSON.stringify({
+                data: {
+                    children: [
+                        {
+                            data: {
+                                archived: false,
+                                downs: 0,
+                                hidden: false,
+                                permalink: '/r/Rabbits/comments/12pa5te/someone_told_pickles_its_monday_internal_fury/',
+                                subreddit: 'Rabbits',
+                                subreddit_subscribers: 298713,
+                                title: 'Someone told pickles it’s Monday… *internal fury*',
+                                ups: 1208,
+                                url: 'https://i.redd.it/cr8xudsnkgua1.jpg',
+                            },
+                        },
+                    ],
+                }
+            }),
+        });
+
+        const result = await randomBunny('rabbits', 'new', 0);
+
+        expect(result.IsSuccess).toBeFalsy();
+        expect(result.Result).toBeUndefined();
+        expect(result.Error).toBeDefined();
+
+        expect(result.Error!.Code).toBe(ErrorCode.LimitOutOfRange);
+        expect(result.Error!.Message).toBe(ErrorMessages.LimitOutOfRange);
+
+        expect(fetchMock).not.toHaveBeenCalled();
+    });
+
+    test("GIVEN limit is greater than 100, EXPECT error to be returned", async () => {
+        fetchMock.mockResolvedValue({
+            body: JSON.stringify({
+                data: {
+                    children: [
+                        {
+                            data: {
+                                archived: false,
+                                downs: 0,
+                                hidden: false,
+                                permalink: '/r/Rabbits/comments/12pa5te/someone_told_pickles_its_monday_internal_fury/',
+                                subreddit: 'Rabbits',
+                                subreddit_subscribers: 298713,
+                                title: 'Someone told pickles it’s Monday… *internal fury*',
+                                ups: 1208,
+                                url: 'https://i.redd.it/cr8xudsnkgua1.jpg',
+                            },
+                        },
+                    ],
+                }
+            }),
+        });
+
+        const result = await randomBunny('rabbits', 'new', 101);
+
+        expect(result.IsSuccess).toBeFalsy();
+        expect(result.Result).toBeUndefined();
+        expect(result.Error).toBeDefined();
+
+        expect(result.Error!.Code).toBe(ErrorCode.LimitOutOfRange);
+        expect(result.Error!.Message).toBe(ErrorMessages.LimitOutOfRange);
+
+        expect(fetchMock).not.toHaveBeenCalled();
+    });
 });