Lucky Spin: Godly Programming - Chapter 41: Explaining 1


Chapter 41: Explaining 1

Jeff did not reply. He simply stood there with a neutral expression.

He understood that his teacher would not grasp it, not now. But he had a hunch that one day, his teacher and many more programmers would.

Time was what technology needed to advance, just as it had in his own world, in his first life.

While this world was still learning to walk, he had already learned how to fly.

Here, they were still teaching loops and conditions as though they were the peak of innovation.

Students still struggled with functions and arrays, and some were even proud of building calculators and login forms.

Some teachers spoke of databases as if they were the highest achievement in system design.

But Jeff had already walked a path far beyond that. He had written code capable of understanding language.

He had built models that did not merely follow instructions but knew how to reason. These models could predict and adapt.

In this world, there were no papers discussing transformers. There were no lectures explaining neural embeddings.

The word "tokenizer" held no meaning here.

Attention mechanisms, softmax functions, and vector spaces were concepts that did not even exist in their vocabulary.

How could he explain why each word in a sentence requires positional encoding?

How could he make them understand why self-attention layers are essential?

Even if he told them, would his teacher and the others truly grasp what it meant for a machine to learn?

To others, code was just a tool. To him, code was a living thing, something that could grow, evolve, and understand.

"If I told them the real process," Jeff thought inwardly as he continued.

"About how I trained the system to weigh context, about masking sequences and calculating loss functions, they would look at me as if I were insane."

"They’re still at the stage of telling machines what to do, line by line. Clearly they have never witnessed the outcome when a machine can independently determines what is logical," he said inwardly.

Dela Cruz’s eyes narrowed as he scrolled through the function list.

"This grammar checker of yours it doesn’t feel like the usual spellcheckers. The logic seems layered," he said, his tone filled with curiosity as he waited for a response.

"Explain this to me, Jeff. How exactly did you make this work?"

Jeff nodded slightly, a spark of energy in his eyes. He had always enjoyed explaining and teaching; this was the moment he loved most.

"So it starts with the tokenizer," he began, pointing toward the function at the top of the screen.

...

Python

import re

def tokenize_input(sentence):
    # lowercase the sentence, then split it into word and punctuation
    # tokens so that each piece can be checked individually
    return re.findall(r"[\w']+|[^\w\s]", sentence.lower())

...

"It’s that word again. What is this tokenizer?" Dela Cruz thought inwardly, showing a puzzled look.

Since he had seen this code and Jeff had mentioned it earlier. Still unsure, Dela Cruz stared at the function name on the screen, his brows furrowed in confusion.

"Tokenizer... what exactly do you mean by that? Is it just splitting the sentence by spaces?" he asked, genuinely curious but clearly unfamiliar with the concept.

Jeff gave a small nod as he raised one finger. "That’s part of it, teacher. But tokenizing is not just about splitting words. It is about breaking the input into meaningful pieces called tokens."

Jeff explained this patiently, understanding his teacher’s lack of knowledge. In this world, AI was not yet advanced; it was only beginning its development.

As mentioned earlier, concepts like transformers, tokenizers, and natural language processing remained unheard of.

So it made sense that his teacher, even as the head of the computer lab, would not fully understand what ’tokenizer’ really meant.

Jeff knew that his teacher might understand basic string manipulation, such as using split or replace in Python, but not the deeper meaning of tokenization as it relates to language models and natural language processing.

He leaned slightly closer to the monitor, pointing at the example:

...

Python

sentence = "She walks to the market."
tokens = ["she", "walks", "to", "the", "market", "."]

...

"As you can see here, teacher, each word and even the punctuation get separated. The system reads every token individually so that it can check grammar, structure, or meaning at the smallest level."

Dela Cruz nodded slowly; inwardly, he was running out of ways to express his appreciation as he silently listened.

But Jeff could see his teacher’s eyes trembling as he explained, so he continued.

"Think of it as chopping up a sentence into puzzle pieces. Instead of reading the sentence as one large chunk, the system looks at each piece one by one. This makes it easier to catch errors such as incorrect verb usage or missing articles."

"In some places, tokenization can get much deeper, like breaking down contractions, handling numbers, or separating symbols. But for this system, I kept it simple."

Dela Cruz scratched his head as he leaned back. "I see, so it’s not just splitting by spaces, but something smarter than that."

Jeff gave a polite smile. "Exactly, teacher. It allows the system to ’see’ the sentence the way a teacher would: word by word, piece by piece, brick by brick," he said, though he did not actually say the phrase ’brick by brick’ out loud.

"I see," Dela Cruz replied as he touched his chin.

Jeff then continued, tapping on the next function.

...

Python

def grammar_checker(tokens):
    error_count = 0
    # rule-based grammar checks run here, one pass per rule,
    # incrementing error_count for each violation found
    return error_count

...

"After tokenizing, the checker runs each word and its neighboring words through a set of conditions, which are the grammar rules I manually set. These include things like subject-verb agreement, common tense mismatches, and passive constructions."

Dela Cruz leaned in closer to listen. When Jeff finished speaking, he asked,

"Manually set? You wrote all the rules yourself?"

Jeff gave a small smile, the kind that neither confirmed nor denied.

"Let’s just say I had a library of patterns ready. I tuned them to fit the kinds of mistakes that students often make, like local context and common writing habits."

Dela Cruz nodded, taking in the information Jeff had shared. Scrolling further, Jeff stopped at another block.

...

Python

def feedback_score(error_count):
    if error_count == 0:
        return "Excellent work!"
    elif error_count <= 3:
        return "Good, but consider revising some parts."
    else:
        return "Needs improvement. Please review your grammar."

...

[Author’s Note: Apologies to the reader Wilder6, as I cannot fully explain this without providing a code example. I hope you understand, my brotha🙏]

"And this scoring system, what does it do?" Dela Cruz asked.

Jeff leaned slightly forward, pointing at the code.

"It is the part that gives feedback based on how many grammar errors the system detects." His tone remained steady and casual as he pointed to the first condition with his finger.

The code he was pointing at was error_count == 0, with a return value of "Excellent work!".

"If the sentence passes without any mistakes, the system praises the writing, telling the user it is excellent."

"So if there are no errors, and when the error count equals zero, the program will send back the message, ’Excellent work.’"

Then he pointed to the next block, elif error_count <= 3, with a return value of "Good, but consider revising some parts."

"This shows that if there are only 1 to 3 errors, it suggests some minor revisions but still acknowledges that the writing is mostly good."

"So if the error count is 3 or less, the program will respond with, ’Good, but consider revising some parts.’ "

Finally, he tapped the last line, else, with a return value of "Needs improvement. Please review your grammar."

"In this last code, if the error count goes above 3 and none of the previous conditions are met, the program will respond with, ’Needs improvement. Please review your grammar.’

"It is a simple threshold-based system. The main idea is to give feedback that is clear and motivating, rather than simply saying ’wrong’ or ’correct.’ It helps the user, or rather the students, to understand how far they are from clean writing and encourages them to improve without feeling discouraged," he ended.