A few hours in GIMP - Whenever I start painting, there's always a fight between keeping it realistic, or heading off in some surreal tangent pic.twitter.com/JGd0F7wfT6
— oioiiooixiii (@oioiiooixiii) December 28, 2016
2016
I dont understand the 2016 hate. Seems to be a USA/Eurocentric thing; every year since 2007 has been "bad" in that case. But I'll play along
— oioiiooixiii (@oioiiooixiii) December 31, 2016
Start of 2016 [mouth.jpg] - End of 2016 [anus.jpg]
— oioiiooixiii (@oioiiooixiii) December 31, 2016
tweet: https://twitter.com/oioiiooixiii/status/815250339452567553
context: http://bgr.com/2016/12/30/was-2016-a-bad-year-yes-it-was/
Things would be different at Apple if Steve Jobs was still there...
Yes, certainly.
original concept 2010: https://2.bp.blogspot.com/.../s1600/Steve%2BJobs%2BSkinny_2010.jpg
【足太ぺんた (Asibuto Penta)】 Happy Halloween
source video: https://www.youtube.com/watch?v=fHLTXhimYao
source video: http://live.nicovideo.jp/watch/lv280004089
source video: http://live.nicovideo.jp/watch/lv280512397
Vote Amerika 2016
context: https://www.youtube.com/watch?v=8wM248Wo54U
content source: https://www.youtube.com/watch?v=6DXDU48RHLU
content source: https://www.youtube.com/watch?v=neBl8cOEGTc
FFmpeg: Extract foreground [moving] objects from video
This is a somewhat crude implementation, but given the right source material, an acceptable result can be generated. It is based on FFmpeg's 'maskedmerge' filter, which takes three input streams: a background, an overlay, and a mask (which is used to manipulate the pixels of the overlay layer).
ffmpeg \
   -i background.png \
   -i video.mkv \
   -filter_complex \
      "color=#00ff00:size=1280x720 [matte];
       [1:0] format=rgb24, split [mask][video];
       [0:0][mask] blend=all_mode=difference,
                   curves=m='0/0 .1/0 .2/1 1/1',
                   format=gray,
                   smartblur=1,
                   eq=brightness=30:contrast=3,
                   eq=brightness=50:contrast=2,
                   eq=brightness=-10:contrast=50,
                   smartblur=3,
                   format=rgb24 [mask];
       [matte][video][mask] maskedmerge, format=rgb24" \
   -shortest \
   -pix_fmt yuv422p \
   result.mkv
For this process, a still background image is needed. An extracted frame from the video will do, or if the background is constantly obscured, it may be necessary to manually create a clean image from multiple frames (stacking multiple frames may produce better results too).
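For instance, a single frame can be extracted with FFmpeg itself. A minimal sketch (the filenames and timestamp below are just placeholders):

# Grab one frame as a background image, from a point where the scene is unobscured
ffmpeg -ss 00:00:05 -i video.mkv -frames:v 1 background.png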
The background image is 'difference' blended with the video, to produce the mask which will be used with the 'maskedmerge' filter. This video stream is then converted to grayscale and adjusted to maximise the contrast levels. [N.B. The video format changes multiple times with different filter effects, and so 'format=rgb24' is set in each filterchain for colour compatibility.]
The curves and equalisation filtering is a bit hard to explain and, due to the lack of a real-time preview, somewhat "hit and miss". Essentially, a 'threshold' filter is being built, where only black and white areas remain. The eq/curves filters progressively squeeze the tones together so that only the wanted areas end up solid white. This will change for each project; the filterchain shown was progressively "hacked together" for this specific video. [N.B. 'maskedmerge' interprets tonality as levels of pixel opacity in the overlay layer.]
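Since there is no real-time preview, one approach is to render the mask chain on its own for a few seconds and inspect the result while tuning the values. A rough sketch (filenames assumed; filter values copied from the command above):

# Render a short preview of just the mask chain, for checking the threshold
ffmpeg \
   -i background.png \
   -i video.mkv \
   -filter_complex \
      "[1:0] format=rgb24 [video];
       [0:0][video] blend=all_mode=difference,
                    curves=m='0/0 .1/0 .2/1 1/1',
                    format=gray,
                    smartblur=1,
                    eq=brightness=30:contrast=3,
                    eq=brightness=50:contrast=2,
                    eq=brightness=-10:contrast=50,
                    smartblur=3" \
   -t 10 \
   -pix_fmt yuv420p \
   mask-preview.mkv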
The first 'smartblur' filter fills out (dilates) the areas to create more solid structures in the mask. The second 'smartblur' filter blends the edges of the mask to create a softer cutout. Additional 'smartblur' filters can be used on the background and on the video stream it is blended with, which will act as a noise filter to cull stray momentary differences.
The final element is a new background for the extracted elements to sit upon. In this example, a simple green matte is generated. This, along with the created mask and the original video, is provided as input to the 'maskedmerge' filter.
There are many ways this can be implemented, adjusted, and improved. In the example above, everything is done within one filtergraph, but it can be separated out into multiple passes (this would be useful for manually fixing errors in the mask). [N.B. Timing can be an issue when running this all in a single filtergraph (where the mask layer didn't match up with the overlay); 29.97fps videos proved particularly troublesome. Repeated use of 'setpts=PTS' in the filtergraph might help, but in this case it was fixed by converting the video to 25fps beforehand.]
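For reference, that 25fps conversion can be done as a separate first step. A hedged sketch (filenames assumed):

# Convert to 25fps before building the mask, to avoid sync problems with 29.97fps sources
ffmpeg -i video.mkv -r 25 -c:a copy video-25fps.mkv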
UPDATE: 2020-05-05
There is some recurring confusion over what I wrote about stacking multiple frames for the background image. It's really not that important; it's just something to help create a more general/average background image by image stacking.
The image-stacking process is just to create a cleaner background image to work with. The idea is to remove momentary anomalies by averaging frames together. The benefits may be negligible though. Image-stacking can be done many ways. I created a quick demo for you using FFmpeg:
— oioiiooixiii (@oioiiooixiii) November 4, 2019
# Image stacking with FFmpeg using 'tmix' filter.
# More info on 'tmix' filter: https://ffmpeg.org/ffmpeg-filters.html#tmix
ffmpeg -i background-frame%d.png -vf tmix=frames=3 stacked.png

# Image stacking is also possible with ImageMagick
convert *.png -evaluate-sequence mean stacked.png
ffmpeg maskedmerge: https://ffmpeg.org/ffmpeg-filters.html#maskedmerge
source video: ぷに (Puni) https://www.youtube.com/watch?v=B0o8cQa-Kd8
Discussion of technique on twitter: https://twitter.com/alihaydarglc/status/982950986175209472
FFmpeg: Create a video composite of colourised macroblock motion-vectors
# Generate video motion vectors, in various colours, and merge together
# NB: Includes fixed 'curves' filters for issue outlined in blog post
ffplay \
   -flags2 +export_mvs \
   -i video.mkv \
   -vf \
      "split=3 [original][original1][vectors];
       [vectors] codecview=mv=pf+bf+bb [vectors];
       [vectors][original] blend=all_mode=difference128,
                           eq=contrast=7:brightness=-0.3,
                           split=3 [yellow][pink][black];
       [yellow] curves=r='0/0 0.1/0.5 1/1':
                       g='0/0 0.1/0.5 1/1':
                       b='0/0 0.4/0.5 1/1' [yellow];
       [pink] curves=r='0/0 0.1/0.5 1/1':
                     g='0/0 0.1/0.3 1/1':
                     b='0/0 0.1/0.3 1/1' [pink];
       [original1][yellow] blend=all_expr=if(gt(X\,Y*(W/H))\,A\,B) [yellorig];
       [pink][black] blend=all_expr=if(gt(X\,Y*(W/H))\,A\,B) [pinkblack];
       [pinkblack][yellorig] blend=all_expr=if(gt(X\,W-Y*(W/H))\,A\,B)"

# Process:
# 1: Three copies of input video are made
# 2: Motion vectors are applied to one stream
# 3: The result of #2 is 'difference128' blended with an original video stream
#    The brightness and contrast are adjusted to improve clarity
#    Three copies of this vectors result are made
# 4: Curves are applied to one vectors stream to create yellow colour
# 5: Curves are applied to another vectors stream to create pink colour
# 6: Original video stream and yellow vectors are combined diagonally
# 7: Pink vectors stream and original vectors stream are combined diagonally
# 8: The results of #6 and #7 are combined diagonally (opposite direction)
NB: At the time of writing, the latest version of FFmpeg (N-81396-g0d8b6a1) has a bug (feature?) whereby the upper and lower bounds of the 'curves' filter must be set explicitly for accurate results. This is contrary to what's written in the official documentation.
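In practice this just means anchoring the '0/0' and '1/1' points in each channel, as was done above. A small illustrative sketch (filename assumed; values taken from the yellow-tint chain):

# Explicitly set both ends of each channel's curve ('0/0' and '1/1')
ffplay -i video.mkv -vf "curves=r='0/0 0.1/0.5 1/1':g='0/0 0.1/0.5 1/1':b='0/0 0.4/0.5 1/1'"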
alternate version:
see related: http://oioiiooixiii.blogspot.com/2016/04/ffmpeg-display-and-isolate-macroblock.html
source video: 足太ぺんた (Asibuto Penta) https://www.youtube.com/watch?v=Djdm7NaQheU
Ian Paisley did 9/11
Steve Jobs: The Eternal Leader
FFmpeg: Video Stabilisation using 'libvidstab'
It is possible to stabilise video with standard FFmpeg using the 'deshake' filter, which can produce satisfactory results¹. Another option is to use FFmpeg with the 'vid.stab' library.
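For comparison, a 'deshake' run needs nothing beyond stock FFmpeg. A minimal sketch with default options (filenames assumed):

# Stabilise using the built-in 'deshake' filter with default settings
ffmpeg -i input.mkv -vf deshake stabilised-deshake.mkv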
In the video above, a side-by-side comparison is made between the original video and the 'vid.stab' stabilised version. The subject matter remains still, while the video content floats around the frame. This is achieved by setting 'zoom' to a negative value, 'optzoom' to 0, and 'relative' to 1. This is not typically desired, as it creates unusual framing; however, it does mean that no picture information is lost in the process. Note also how missing information is replaced by the content of previous frames². The other option is to leave these areas black. Further information on settings can be found on Georg Martius's website.
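As a rough sketch of that setup (filenames and the detection values are assumptions; the transform options mirror the ones described above), the usual two-pass vid.stab run looks like this:

# Pass 1: analyse the motion and write the transforms file
ffmpeg -i input.mkv -vf "vidstabdetect=result=transforms.trf:shakiness=10" -f null -

# Pass 2: apply the transforms with a negative zoom, no optimal zoom, and relative transforms
ffmpeg -i input.mkv \
   -vf "vidstabtransform=input=transforms.trf:relative=1:zoom=-50:optzoom=0:crop=keep" \
   stabilised.mkv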
I've created a Bash script to aid in setting the values for video stabilisation (link at end of post). It was intended as a way of getting to grips with the different settings, rather than as a final application. It presents a crude interface using Zenity; all options can be set with it quickly, and it builds complete filtergraphs for both the first and second pass. It also creates a video using FFmpeg's default values for MKV files. It produces a rudimentary log file, as follows:
**** Wed Jul 6 17:02:07 IST 2016 ****
ARRAY VALUES: |10||||||||200|||0|||1|-50|0||||
vidstabdetect=result=transforms.trf:shakiness=10:accuracy=15:stepsize=6:mincontrast=0.3:tripod=0:show=0
vidstabtransform=input=transforms.trf:smoothing=200:optalgo=gauss:maxshift=-1:maxangle=0:crop=keep:invert=0:relative=1:zoom=-50:optzoom=0:zoomspeed=0.25:interpol=bilinear:tripod=0:debug=0

# Info:
# 1: Time and date of the specific filtering. The filter choices of each run on a video get added to the same log file.
# 2: Clearly shows the user-specified values for filtering (blanks between '|' symbols indicate default values used)
# 3: Filtergraph used for first pass
# 4: Filtergraph used for second pass
To use 'vid.stab' features in FFmpeg, FFmpeg must be compiled using the following procedure (correct as of this post's date):
# Using the FFmpeg compilation method for GNU/Linux, found here:
# https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
# Add (or complete) the following pre-compile procedures
# ---------------------------------------------------------
cd ~/ffmpeg_sources
wget -O vid-stab-master.tar.gz https://github.com/georgmartius/vid.stab/tarball/master
tar xzvf vid-stab-master.tar.gz
cd *vid.stab*
cmake .
make
sudo make install
# ---------------------------------------------------------
# When compiling FFmpeg, include '--enable-libvidstab' in the './configure' options.
# Create the necessary symlinks to 'libvidstab.so' automatically by running:
sudo ldconfig
On a final note, vid.stab refuses to work with videos of certain pixel formats, so I encoded all test video as 'yuv420p' which worked without a problem.
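A hedged example of that conversion (filenames and CRF value are assumptions):

# Re-encode to the 'yuv420p' pixel format before running vid.stab
ffmpeg -i input.mkv -pix_fmt yuv420p -c:v libx264 -crf 18 -c:a copy input-yuv420p.mkv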
¹ Deshake has an advantage over vid.stab, in that it allows setting a region for motion search.
² In some other systems (like 'Deshaker' for VirtualDub) missing frame information can be interpolated from bi-directional frame analysis.
bash script: http://oioiiooixiii.blogspot.com/p/context-download-binbashset-e-x-script.html
vid.stab home: http://public.hronopik.de/vid.stab/features.php
vid.stab github: https://github.com/georgmartius/vid.stab
initial reading: https://www.epifocal.net/blog/video-stabilization-with-ffmpeg
source video: えんそく (Ensoku) https://www.youtube.com/watch?v=jWdQMgBlXEo
Der kleine Kobold (The Little Kobold)
John Kirby reminds me of Jim Carrey
source video: https://www.youtube.com/watch?v=kpLhwcJH7-o
source video: https://www.youtube.com/watch?v=geiS49_p84Q
North Korea is a serious threat to America...
"Can you see the imperialists, dear Leader?" - "No, not a sign of them. Our tireless rhetoric has frightened them away!"
America could flatten North Korea in a heartbeat; could flatten it before North Korea got the kindling smoldering in one of its glorified fireworks; but it hasn't. The debate goes back and forth: America won't attack a country with nuclear weapons - America won't remove a country that provides it with a perfect excuse to keep a massive military presence in the area.
context: http://fair.org/uncategorized/north-korea-rattles-sabres-meanwhile-u-s-pretends-to-drop-nuclear-bombs-on-them/
Judas confronts Jesus
context: https://www.youtube.com/watch?v=PaKIZ7gJlRU
context: https://www.youtube.com/watch?v=8EW03GCm74M
Michelle Smith: More Drug than Human, but...
In 1998 Smith failed a drugs test when some urine was found in her whiskey sample. The shame of it. The sad reality dawned on us that all those gold medals in '96 were probably won by Smith injecting herself with Miracle Grow. For a while, some of us kept on believing. Part of RTÉ's coverage of the '96 Games included the commentary of Dr. Gary O'Toole, a doctor who had previously competed for Ireland in swimming. He had predicted every one of Smith's victories (and even her final bronze medal), and he said nothing to concern us over her sudden rise to Olympic champion. It later turned out (and I was quite shocked by this) that Gary O'Toole was ordered by RTÉ, Ireland's state-run broadcaster, to shut his mouth and not mention his concerns:
"The directive came down that nobody was to discuss drugs and Michelle Smith on national television"
source: http://community.seattletimes.nwsource.com/archive/?date=20000814&slug=4036702

Michelle Smith's Olympic victories stand, as she never tested positive for banned substances during the 'Games. Whatever really went on with the drug-taking, it's hard to forget the vitriol shown by the loser-Americans of 1996. "She broke the rules," a bitter, twisted-faced Janet Evans whined. As we all know: if you beat the Americans at something, they will come after you until they find a way to destroy you. That case of "sample tampering" in 1998 still seems suspicious; a tad convenient; like they couldn't catch her with the drugs, so they spiced up the samples themselves. Everyone cheats in the end.
Show me the athlete, and I'll find you the substance.

* Partial content originally published: August, 2012
— oioiiooixiii (@oioiiooixiii) March 24, 2016
more info: https://en.wikipedia.org/wiki/Michelle_Smith
TOIlet (toilet) & FFmpeg: Capturing (formatted) terminal text output as video
for i in {0..9}; do echo "WONDERFUL $i"; done \
   | toilet --gay \
   | ffmpeg -f tty -i - tty-out.gif
— oioiiooixiii {gifs} (@oioiiooixiii_) June 15, 2016
To use TOIlet formatted text on a webpage:
# Basic syntax for html output
toilet -f smmono9 --html oioiiooixiii
# Strips out <br /> tags to improve formatting in blogger
htmlText="$(toilet -f smmono9 --gay --html oioiiooixiii)"
echo "${htmlText//"<br />"/""}"

more info: http://libcaca.zoy.org/toilet.html
FFmpeg: The recursive effects of stacking tblend filters
# Twelve stacked 'tblend=all_mode=difference128' filters.
# Deblocking 'spp' and 'average' filters used to minimise strobing effects.
# Due to latency issues, the result in ffplay will differ from an ffmpeg rendering.
ffplay \
   -i video.mp4 \
   -vf \
      "scale=-2:720,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       spp=4:10,
       tblend=all_mode=average,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       spp=4:10,
       tblend=all_mode=average,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       spp=4:10,
       tblend=all_mode=average,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128,
       tblend=all_mode=difference128"
In the video above, 4 versions of the effect are shown. In order:
- The effects of the tblend filter chain as given
- A tblend filter chain four times as long
- The 'mode' alternates between 'difference128' and 'difference'
- An 'average' mode between each 'difference128' tblend filter
Here, the result of the shown tblend filter chain is combined with the original video using various modes (a minimal sketch of this split-and-blend arrangement follows the list). In order:
- all_mode=difference128
- all_mode=difference
- all_mode=multiply128
- all_mode=multiply
- c0_mode=difference128
- unknown 'average_test.mkv'
- unknown 'all_average_test.mkv'
- unknown 'difference_test.mkv'
- unknown 'difference128_test.mkv'
- Original source material
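As a minimal sketch of how such combinations can be set up (filename and modes assumed; the tblend chain is shortened here), the filtered stream is split from the original and blended back over it:

# Split the input, run a shortened tblend chain on one copy,
# then blend it back over the untouched copy
ffplay \
   -i video.mp4 \
   -vf \
      "scale=-2:720,
       split [original][fx];
       [fx] tblend=all_mode=difference128,
            tblend=all_mode=difference128,
            tblend=all_mode=difference128 [fxd];
       [original][fxd] blend=all_mode=difference128"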
source video: https://www.youtube.com/watch?v=50lAMbJUXfc
That time when Bill Clinton lost the nuclear launch codes
The thermonuclear biscuit is safe and sound in the ass vault, Mr. President.
* partial blog content originally published in 2010
context: http://www.theatlantic.com/politics/archive/2010/10/why-clintons-losing-the-nuclear-biscuit-was-really-really-bad/65009/
Putin orders execution of two walruses so their blood can be used as a stimulant for Russian Olympic swimmers
context: http://www.telegraph.co.uk/news/2016/07/21/two-walruses-die-at-vladimir-putin-backed-oceanarium/
Putin's Secret Dope-Rocket and the Cybernetic Army of Superhuman Russian Athletes
Putin gives you wings!
Woke to radio this morning; someone saying "Putin is responsible". Dont know what he's responsible for this time; jumped out of bed laughing
— oioiiooixiii (@oioiiooixiii) July 19, 2016
context: http://www.irishexaminer.com/archives/2016/0722/opinion/ban-all-russian-athletes-at-rio--russian-doping-411674.html
Putin has a big needle filled with the dope juice. He jabs it in the athletes bum-bums whenever they visit the Kremlin.
— oioiiooixiii (@oioiiooixiii) July 20, 2016
more info: http://thechronicleherald.ca/world/1382083-the-latest-russian-athletes-back-us-based-klishina-for-rio
Pardon me while I have a strange interlude...
Jen hasn't smiled since March 24th. Well, a smile or two perhaps, but no laughter; not like before. No, things are definitely different now. Chilled, and tense; like frozen beef. I think the end is in sight, and they're all just going through the motions. Strange how the wind blows tonight.
context: https://twitter.com/oioiiooixiii/status/742822155713425409
context: https://youtu.be/T6RkPxawpY0?t=2m30s
context: ...
Increase monitor DPI with xrandr
# MoreDPI - increase resolution beyond monitor limits
alias moredpi="xrandr --output VGA1 --scale 1.25x1.25 --panning 1280x960"
In this case, the upper resolution limit of the monitor [VGA1] is 1024x768 (4:3). The screen is scaled down by 25% [--scale 1.25x1.25]. A "--panning" argument is given to allow this new area to be interacted with.

One of my bash aliases: written in desperation after reading the xrandr man page. For use with monitors that have archaic max resolutions.
— oioiiooixiii (@oioiiooixiii) January 19, 2016
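The output name ('VGA1' here) and its current mode can be checked beforehand; a quick sketch:

# List connected outputs (and the first listed mode for each)
xrandr --query | grep -A1 " connected"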
To return the screen to default settings, the following alias is used:
# LessDPI - reset resolution to default
alias lessdpi="xrandr --output VGA1 --mode 1024x768 --scale 1x1 --panning 1024x768"

xrandr man: http://www.x.org/archive/X11R7.5/doc/man/man1/xrandr.1.html
Angle Dance Karaoke
link: https://www.youtube.com/watch?v=n0A--mdo3jg
Someone made a karaoke version of "Angle Dance" https://t.co/uDTgbvpDjt pic.twitter.com/HYjEanekPN
— oioiiooixiii (@oioiiooixiii) April 11, 2016
FFmpeg: Recursive video capture
#!/bin/bash
# Rudimentary script for recording X11 desktop looping through FFplay
# For other platforms see: https://trac.ffmpeg.org/wiki/Capture/Desktop

ffmpeg \
   -video_size 1024x768 \
   -framerate 25 \
   -f x11grab \
   -i :0+100,200 \
   -c libx264 \
   -preset ultrafast \
   -crf 6 \
   recordscreen543.mkv \
   -y &

pid="$!"
trap "kill $pid" EXIT

ffplay \
   -video_size 1024x768 \
   -framerate 25 \
   -f x11grab \
   -i :0+100,200

exit

Longer video demonstrating the effect.
Angela Merkel enjoys VR gaming
alt. audio: https://youtu.be/WRU_w1PCVbU
source video: https://www.youtube.com/watch?v=BOJYjq58BTM&t=56s
source video: https://www.youtube.com/watch?v=UMZSjX2R61E&t=2m20s
The Irish political ship is adrift
Ireland is still without a government; the elected can't agree. It's vital an agreement is reached soon... pic.twitter.com/TSkeI2rFXq
— oioiiooixiii (@oioiiooixiii) April 13, 2016
FFmpeg: Display and isolate macroblock motion-vectors in mpeg video
# Isolate motion-vectors using 'difference128' blend filter
# - add brightness, contrast, and scaling, to taste
ffplay \
   -flags2 +export_mvs \
   -i "video.mp4" \
   -vf \
      "split [original],
       codecview=mv=pf+bf+bb [vectors],
       [vectors][original] blend=all_mode=difference128,
       eq=contrast=7:brightness=-0.3,
       scale=720:-2"

Works best with higher-resolution videos; 4K source used in this case.
more info: https://trac.ffmpeg.org/wiki/Debug/MacroblocksAndMotionVectors
source video: Czech National Ballet - https://www.youtube.com/watch?v=bn12Ffi15Go
shorter alt. version: https://www.youtube.com/watch?v=KN_c4mdBpvg
What if Donald Trump is Putin?
context: https://twitter.com/adamjohnsonNYC/status/715649757167951877
What if, Donald Trump IS Putin? Like in a rubber mask. That's how they both can be Hitler! It all makes total sense pic.twitter.com/rugrEkVWh8
— oioiiooixiii (@oioiiooixiii) March 31, 2016