
Can't add additional custom tokenizer or renderers? #1693

Open
alystair opened this issue May 31, 2020 · 9 comments
@alystair alystair commented May 31, 2020

What pain point are you perceiving?
I'm reviewing the Marked documentation, attempting to create a custom setup that transforms lines starting with 'notice: ' into a specifically formatted DIV. By my understanding I need to first add a custom named tokenizer and then a renderer based on it? Or am I going about this the wrong way?

Describe the solution you'd like
I'd like to easily add this functionality without creating a separate post-processing tool external to Marked...


@KilianKilmister KilianKilmister commented May 31, 2020

You can probably get by with a walkTokens function. It acts as an intermediate processing step after lexing, before the tokens are handed to the renderer. Its abilities are somewhat limited, but they should be plenty for things like this.
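A minimal sketch of that idea, written as a standalone callback (the token field names here match what marked's lexer produces for paragraphs, but treat the exact shapes as an assumption; the "notice: " prefix and the emitted DIV are from the original question):

```javascript
// Sketch of a walkTokens-style callback: it rewrites any paragraph token
// whose text starts with "notice: " into a raw HTML token, so the default
// renderer emits our DIV and no custom renderer method is needed.
function noticeWalker(token) {
  if (token.type === 'paragraph' && token.text.startsWith('notice: ')) {
    const body = token.text.slice('notice: '.length);
    token.type = 'html';
    token.text = `<div class="tip notice">${body}</div>`;
  }
}

// With marked installed, you would register it roughly like:
//   marked.use({ walkTokens: noticeWalker });
```

The appeal of this approach is that it sidesteps the hardcoded Lexer/Parser structure discussed below: you only mutate tokens the lexer already knows how to produce.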

Author

@alystair alystair commented May 31, 2020

I tried the following, but it doesn't work. Does the tokenizer option only substitute existing tokens? Can you not add your own custom ones?

const tokenizer = {
	tipNotice(src) {
		const match = src.match(/\nnotice: (.*)\n/);
		if (match) {
			return {
				type:'tipNotice',
				raw:match[0],
				text:match[1].trim()
			};
		}
		return false;
	}
};
const renderer = {
	tipNotice(text) {
		return `<div class="tip notice">${text}</div>`;
	}
};
marked.use({ tokenizer, renderer });
marked(content,{ headerIds:true });
@alystair alystair changed the title Do I need a custom tokenizer or renderer? Can't add additional custom tokenizer or renderers? May 31, 2020

@KilianKilmister KilianKilmister commented Jun 1, 2020

@alystair Both the Parser and the Lexer implementations currently have their structure hardcoded, so there is currently no straightforward way of altering what they hand to the Tokenizer and Renderer; those two are deliberately "dumb", as they are built for speedy processing.

And as it turns out, it is pretty much impossible to make invasive modifications short of altering the source code or resorting to metaprogramming.

Whoever wrote the core of this package can really pat themselves on the back; it is as solid as granite.

I filed an issue about that yesterday: #1695

Author

@alystair alystair commented Jun 1, 2020

Argh... in that case I'll have to replace several built-ins like paragraph, and review the source code to expand on the current functionality. Fun.
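Replacing a built-in like paragraph was in fact supported by marked.use at the time of this thread (returning false from an override falls back to the default behavior, per marked's docs for 1.x). A sketch of that route, reusing the "notice: " idea from the original question:

```javascript
// Sketch of overriding the existing paragraph renderer instead of adding a
// brand-new token type. The override object is merged in via marked.use;
// returning false defers to marked's default paragraph rendering.
const renderer = {
  paragraph(text) {
    if (text.startsWith('notice: ')) {
      return `<div class="tip notice">${text.slice('notice: '.length)}</div>`;
    }
    return false; // fall back to the default <p>...</p> output
  }
};

// With marked installed:
//   marked.use({ renderer });
```

This only works because paragraph is a token type the lexer already emits; it does not let you introduce a new token type, which is the gap this issue is about.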


@KilianKilmister KilianKilmister commented Jun 1, 2020

A little hint: copy the repo and directly modify the source code. I spent a full day trying to alter the parser and lexer from inside my module and have nothing to show for it, as marked is stable as hell.

But the code is modern and well formatted. Navigating it is a breeze.

Member

@UziTech UziTech commented Jun 22, 2020

This is something that could be useful. If someone wants to create a PR I would be happy to review it.


@dvkndn dvkndn commented Aug 21, 2020

@UziTech the PR would add a "token" param to renderer methods, right? So, for example, the "paragraph" one would receive "token" in addition to "text"?

Member

@UziTech UziTech commented Aug 21, 2020

No, the issue is about the ability to add additional tokenizers and renderers. For example, adding an "underline" tokenizer and renderer to extend markdown.
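For readers arriving later: marked eventually shipped exactly this capability as the `extensions` array accepted by marked.use (added in a later major version, well after this thread). A sketch of what an "underline" extension could look like in that shape; the `++text++` syntax is our own invention for illustration, and the tokenizer here is simplified (a real one would hand inner text back to the lexer for inline parsing):

```javascript
// Sketch of a custom extension in the shape marked.use eventually accepted:
// name/level identify the token, start() helps marked find candidates fast,
// tokenizer() produces the token, renderer() turns it into HTML.
const underlineExtension = {
  name: 'underline',
  level: 'inline',
  start(src) { return src.indexOf('++'); },
  tokenizer(src) {
    const match = /^\+\+([^+]+)\+\+/.exec(src);
    if (match) {
      return { type: 'underline', raw: match[0], text: match[1] };
    }
  },
  renderer(token) {
    return `<u>${token.text}</u>`;
  }
};

// With a version of marked that supports extensions:
//   marked.use({ extensions: [underlineExtension] });
```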

Member

@UziTech UziTech commented Aug 21, 2020

Moving the renderer to taking tokens instead of specific parameters is something that will have to happen on a major version bump, but I think it would help with this issue.
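Purely to illustrate the signature change being proposed (marked did make a switch of this kind in a later major release; the field names below mirror its paragraph token but are an assumption here):

```javascript
// Illustrative only: a paragraph renderer that receives the whole token
// object instead of a pre-extracted text string, so custom token types
// can carry whatever fields they need.
function paragraphFromToken(token) {
  // A real implementation would recursively render token.tokens (the
  // inline children); this sketch just uses the flat text.
  return `<p>${token.text}</p>`;
}
```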
